Your Brain Has Been Running 18th-Century Software This Whole Time

And it turns out, that’s actually a good thing


Most of us like to think we’re rational. We gather evidence, weigh our options, and arrive at sensible conclusions. We’re open-minded, of course — but once we’ve made up our mind about something, we hold our ground. That’s not stubbornness, that’s conviction.

Here’s the uncomfortable truth: that’s not rationality. That’s just stubbornness with better PR.

The real logic of good thinking — the kind that powers artificial intelligence, cracks wartime codes, and saves lives in hospitals — was figured out by an obscure English clergyman over 250 years ago. His name was Thomas Bayes, and his deceptively simple idea might be the most practical mental upgrade available to anyone willing to sit with a little discomfort.

This isn’t a post about math. It’s about four ideas that genuinely change how you see the world — and why the most intelligent thing you can do is be willing to change your mind.


1. The Man Who Changed Everything (And Nobody Knew It)

Before we get to the ideas, the origin story deserves a moment — because it’s genuinely strange.

The Reverend Thomas Bayes was a clergyman from Tunbridge Wells. Not exactly the biography you’d expect from the father of modern probabilistic reasoning. More striking still: he was entirely unknown for his mathematical work during his lifetime. His landmark essay was published posthumously in 1763, two years after his death. He never lived to see the revolution he started.

And what a revolution it was. His ideas quietly slipped into some of the most consequential moments in modern history. During World War II, Alan Turing and his team at Bletchley Park used Bayesian principles to crack the Nazi Enigma code — treating their guesses about the machine’s settings not as fixed assumptions but as fluid, updateable estimates. Every time a new pattern emerged from intercepted messages, they revised their probability. They didn’t need to be certain. They just needed to be less wrong than yesterday.

That’s the spirit of Bayesian reasoning in a single sentence.

What makes this origin story so compelling isn’t just the drama of it — it’s the irony. A ghost of a man, unknown in his own era, laid the intellectual foundation for machine learning, medical diagnostics, spam filters, and pandemic modelling. His face arguably belongs on currency. Instead, most people can’t pronounce his name.

The takeaway here isn’t just biographical trivia. It’s a reminder that the most powerful ideas are often the quietest ones — the ones that don’t announce themselves but simply… work.


2. Your Beliefs Are Not Facts. They’re Probabilities in Disguise.

Here is the core of Bayesian thinking, stripped to its bones: you should never hold a belief as an absolute certainty. You should hold it as a probability — and you should update that probability the moment new evidence appears.

That sounds reasonable enough until you actually try to live by it. Because most of us, in practice, treat our beliefs as binary. Either something is true or it isn’t. Either this person is trustworthy or they’re not. Either this plan is a good idea or it’s foolish.

Bayes’ great gift was the formula — and more importantly, the mindset — for moving beyond this. His theory provides the mechanism by which a prior belief gets revised in light of new information to produce a posterior belief. Fancy words for something your spam filter has been doing effortlessly for years.

Your email’s spam filter doesn’t know a message is junk. It assigns a probability. A suspicious link raises the likelihood. A mention of a Nigerian prince raises it further. A familiar sender lowers it. The filter is constantly learning, constantly revising — never fully certain, always updating.
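To make that update concrete, here is a minimal sketch in Python of the kind of arithmetic a Bayesian spam filter performs. The specific probabilities are invented purely for illustration; a real filter learns its numbers from millions of messages.

```python
def bayes_update(prior, p_clue_if_spam, p_clue_if_ham):
    """Return P(spam | clue) given a prior P(spam) and two likelihoods."""
    spam_weight = prior * p_clue_if_spam
    ham_weight = (1 - prior) * p_clue_if_ham
    return spam_weight / (spam_weight + ham_weight)

# Illustrative numbers only -- a real filter estimates these from data.
p_spam = 0.20                              # prior: assume 20% of all mail is junk
p_spam = bayes_update(p_spam, 0.70, 0.05)  # suspicious link: far likelier in spam
p_spam = bayes_update(p_spam, 0.40, 0.001) # "Nigerian prince": much likelier in spam
p_spam = bayes_update(p_spam, 0.02, 0.60)  # sender is in your address book

print(f"Updated spam probability: {p_spam:.1%}")
```

Each line treats the previous posterior as the new prior, which is the “never fully certain, always updating” loop in code. (Strictly, chaining the clues like this assumes they are independent of one another, which is the “naive” in naive Bayes.)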

“Essentially, he produced a theory for learning from experience.”

That’s it. That’s the whole thing. Not a theory for being right. A theory for learning.

The practical power of this reframe is enormous. If you stop asking “Is this true?” and start asking “How much does this new evidence change what I already believe?”, you exit the all-or-nothing trap that distorts so much human thinking. You don’t have to abandon your convictions — you just hold them with open hands rather than clenched fists.

Think about how differently you’d approach an argument, a business decision, or a personal relationship if your goal wasn’t to defend a position but to accurately track the probability that you’re right. The latter is actually much more powerful. It makes you harder to manipulate and easier to correct when you need to be.


3. The Medical Test Paradox That Will Make You Question Everything

This is where Bayesian reasoning stops being abstract and starts being genuinely life-altering. Brace yourself, because the following is statistically true and deeply counterintuitive.

Imagine you receive a positive result on a mammogram. The test is highly accurate: it detects 90% of actual cancers and correctly clears 97% of healthy patients. You would be forgiven for assuming you have roughly a 90% chance of having cancer. After all, the test is 90% accurate, and you tested positive.

You’d be catastrophically wrong.

Here’s why. To understand what your positive result actually means, you have to factor in the base rate — the underlying likelihood of having cancer in the first place. In a general screened population, roughly 1% of women have breast cancer.

Run the numbers on 100 women:

  • 1 woman has cancer. With 90% sensitivity, the test very likely finds her. Call it 1 true positive.
  • 99 women are healthy. The test correctly clears about 96 of them but wrongly flags roughly 3. That’s 3 false positives.

Four women test positive in total. Only one of them has cancer.

That positive result you were so alarmed about? It carries a 25% probability of disease — not 90%. Three out of four positive results in this scenario are false alarms.
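If you want the exact arithmetic rather than the rounded count of women, here is a short sketch applying Bayes’ theorem directly to the figures above (1% base rate, 90% sensitivity, 97% specificity). The precise answer lands just under a quarter; the 25% in the prose is the same result rounded to whole women.

```python
# Figures from the example above.
prevalence = 0.01      # P(cancer) -- the base rate
sensitivity = 0.90     # P(positive | cancer)
specificity = 0.97     # P(negative | healthy)

false_positive_rate = 1 - specificity
p_positive = (prevalence * sensitivity
              + (1 - prevalence) * false_positive_rate)

# Bayes' theorem: P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
p_cancer_given_positive = prevalence * sensitivity / p_positive

print(f"P(cancer | positive test) = {p_cancer_given_positive:.1%}")  # about 23%
```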

This is called the Base Rate Fallacy, and it is one of the most important cognitive biases you’ve never heard of. It’s not just a medical curiosity — it shows up everywhere. In hiring, when you over-index on one impressive interview. In relationships, when a single argument feels like evidence of a fatal flaw. In business, when one bad quarter looks like proof the strategy is broken.

The fix is deceptively simple: before you react to a “clear sign,” ask yourself — what was the base rate likelihood of this happening anyway? How often does this kind of thing occur in this kind of context, independent of whatever I’m reacting to?

It’s the mental equivalent of checking whether the smoke alarm is triggered by fire or by toast. The alarm sounds the same. The appropriate response is very different.


4. True Objectivity Is a Myth — and Admitting That Makes You More Scientific

Here’s a claim that tends to make people uncomfortable: there is no such thing as a perfectly objective scientist. Or a perfectly objective journalist. Or a perfectly objective hiring manager. Or a perfectly objective you.

The traditional ideal of scientific reasoning pictures a blank-slate researcher — someone who looks at raw data, sets aside all prior beliefs, and lets the facts speak for themselves. Pure, untainted, neutral. Bayesian inference politely but firmly calls this a myth.

“Bayesianism acknowledges that we never start from a blank slate. There are always underlying judgments and historical evidence.”

This is why Bayesian thinking remains genuinely controversial in some scientific circles. It dares to formalize the role of human judgment — to say, yes, what you already know influences how you interpret new evidence, and we should account for that explicitly, not pretend it away.

But here’s the counterintuitive twist: acknowledging your priors doesn’t make you less rigorous. It makes you more rigorous. Because the moment you admit that your prior beliefs are shaping how you read the evidence, you can actually measure that influence. You can ask: is my prior justified? Is it distorting my reading of the data? How much would the evidence need to shift before I changed my view?
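One way to make that influence visible is to run the same piece of evidence through two different priors and compare the posteriors. A toy sketch, with invented numbers, might look like this:

```python
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """P(hypothesis | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Same evidence (say, one promising but small study), two different priors.
# The likelihoods are invented for illustration.
evidence = dict(p_evidence_if_true=0.80, p_evidence_if_false=0.20)

skeptic = posterior(prior=0.05, **evidence)   # "claims like this rarely pan out"
optimist = posterior(prior=0.50, **evidence)  # "it's a coin flip"

print(f"Skeptic's posterior:  {skeptic:.1%}")   # about 17%
print(f"Optimist's posterior: {optimist:.1%}")  # about 80%
```

The gap between the two answers is the measured influence of the prior, which is exactly the thing the blank-slate ideal pretends does not exist.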

In clinical drug trials, researchers don’t walk in ignorant. They use existing data, meta-analyses, and expert judgment to decide which drugs are worth testing and how to design the study. That’s not bias — that’s wisdom in action. It’s the responsible use of prior knowledge, rather than the pretense of having none.

For the rest of us, this has a powerful everyday application. The next time you’re evaluating something — a new idea, a piece of feedback, a news story — try naming your prior belief explicitly. What do you already think is true? Why? Then ask: how much does this new information actually move the needle?

You’re not eliminating bias by doing this. You’re making it visible. And visible bias is the only kind you can actually manage.


5. Your Brain Is Already Doing This. It’s Just Not Doing It Consciously.

The most quietly astonishing thing about Bayesian reasoning is that your brain isn’t a stranger to it. In fact, your brain might be the most sophisticated Bayesian machine that exists.

Consider what happens when you walk into a dark room. You don’t freeze in existential uncertainty about whether the floor is still there. Your brain — drawing on thousands of previous dark-room experiences — predicts the floor’s presence and acts accordingly. It only updates that model when something unexpected happens: a misplaced shoe, a step that isn’t where you expected it.

This is your “Bayesian brain” at work. It doesn’t rebuild its model of reality from scratch every second — that would be computationally impossible. Instead, it maintains a running model, based on prior expectations, and updates it continuously as new sensory data arrives. Every glance, every sound, every familiar smell is a small piece of evidence feeding into an evolving internal map.

“Fundamentally, Bayesian ideas reflect what it means to be human.”

Our consciousness, on this view, is essentially a machine for reducing uncertainty. We are not passive observers of the world. We are professional revisers — constantly taking in new information, testing it against our existing model, and adjusting.

The gap, for most of us, is that we do this unconsciously in physical space but resist doing it in intellectual and emotional space. Your brain automatically updates its prediction about the floor. It doesn’t automatically update its prediction about whether your colleague dislikes you, or whether your product idea is viable, or whether your original diagnosis of a situation was correct.

That resistance — the gap between automatic physical updating and resistant intellectual updating — is where Bayesian thinking offers its deepest value. The skill isn’t learning to be Bayesian. Your brain already knows how. The skill is learning to bring that same instinct into conscious, deliberate reasoning.


The Bigger Picture: Growth Isn’t About Being Right

The legacy of Thomas Bayes — the ghost clergyman, the unintentional revolutionary — is a remarkably humble one for such a sweeping idea.

Bayesian reasoning doesn’t promise that you’ll arrive at the truth. It promises something more achievable and, in its way, more profound: that you’ll be a little less wrong with every new piece of evidence you encounter. That your model of the world will grow more accurate over time, not through sudden revelation, but through patient, incremental revision.

It’s a framework that rewards curiosity over certainty, and intellectual honesty over intellectual bravado. In a world saturated with noise, hot takes, and the social pressure to have a view and hold it hard, that’s a genuinely radical posture.

The four ideas in this post — that beliefs are probabilities, that base rates matter more than alarming signals, that acknowledging bias is the beginning of rigour, and that your brain is already wired for this — are not just philosophical curiosities. They are practical instruments for better thinking, better decisions, and a better relationship with being wrong.

Which brings us to the question worth sitting with: What belief are you currently holding as a certainty — that, if you’re honest, should probably just be a probability?

Whatever your answer is, that’s where your next update begins.


Enjoyed this? Share it with someone who’s a little too certain about something.
