The psychological blocks that prevent people from dealing with the evidence in front of them, and what to do about this central problem in our lives.

I have excerpted from an excellent article by Josh Stylman, who continues to amaze me.

Best to read the whole thing, but trenchant excerpts are below.

https://stylman.substack.com/p/the-prison-of-certainty

… In my previous work, I’ve explored how our information landscape is systematically engineered through algorithmic division (‘Engineering Reality‘), institutional narratives (‘Reading Between the Lies’), and the systematic dismissal of pattern recognition (‘That Can’t Be True’). But understanding these external systems is only half the equation. The other half lies within us – the psychological mechanisms that make us resistant to changing our minds even when confronted with overwhelming evidence.

Why Being Wrong Hurts

I was talking with a friend recently about historical events that don’t add up. When I suggested he look at some evidence questioning the official 9/11 narrative, he shut down immediately – not because he’s unintelligent or incurious, but because he lost a friend that day. His emotional connection to the event has created a psychological fortress that no evidence can penetrate. Similarly, many who zealously defended COVID policies now acknowledge “mistakes were made” but insist “experts had good intentions.” This isn’t reckoning; it’s rationalization.

An Atlantic article I just came across titled ‘Why the COVID Reckoning Is So One-Sided’ (non-paywalled version) perfectly illustrates this psychological resistance to changing beliefs. Jonathan Chait, the author, smugly criticizes conservatives while demonstrating the very cognitive blindness I’m describing – brushing off liberal “mistakes” as mere good-faith errors rather than the systematic failures that devastated lives. Nowhere does he acknowledge the weaponized censorship that crushed dissent.

This connects directly to what I discussed yesterday about parallel realities. My own experience illustrates this divide – when I spoke against mandates, many in my personal and professional circles couldn’t defend their positions with science or logic. Rather than engage, they simply stopped communicating with me. Now we exist in the separate timelines I described yesterday. I’m not bitter – just genuinely confused by how easily human connections fractured when beliefs were challenged. I have forgiveness in my heart, but I won’t forget how quickly people revealed their true priorities when social conformity conflicted with open inquiry.

These reactions reveal something profound about human psychology: admitting we’ve been manipulated isn’t merely a matter of processing new information. It requires confronting the possibility that our fundamental understanding of reality – and perhaps our very identity – was built on falsehood.

The Cost of Admission

Consider the mRNA vaccines. For parents who rushed to get their children vaccinated, or doctors who enthusiastically promoted them to patients, acknowledging potential harms isn’t simply a matter of updating their risk assessment. It would mean confronting the unbearable possibility that they may have harmed those they love most.

Healthcare workers were prioritized for vaccination, locking them into the narrative early. Once you’ve taken the shot and pushed it on patients, your identity – professional judgment, ethics, self-image as a healer – hinges on its safety. The cost of admitting error becomes psychologically prohibitive.

The cost becomes devastatingly personal. Several friends now take their children to cardiologists for issues that developed after vaccination. Only one has privately confided that he believes the shots caused his child’s condition. For the others, acknowledging this possibility would mean confronting an unbearable guilt – that they may have harmed their child by following what they believed was responsible medical advice.

This explains why some of the most dedicated defenders of these interventions are often healthcare providers who administered them. As psychologist Leon Festinger and his colleagues demonstrated in their landmark 1956 study ‘When Prophecy Fails,’ when evidence contradicts a core belief, many people don’t abandon the belief – they double down on it while dismissing the evidence….

For many educated professionals, their social standing depends on being seen as informed and rational. Admitting they were fundamentally wrong about important matters threatens not just their beliefs but their status. If you’ve built your identity around being “evidence-based” or “following the science,” acknowledging you were misled challenges your core self-concept.

This explains the vehemence with which many defended increasingly incoherent COVID policies. Their fierce attachment wasn’t to the policies themselves but to their self-image as rational followers of expert guidance. Changing their position wasn’t merely a factual update – it meant losing face.

How Our Brains Fight Truth

Research in cognitive neuroscience suggests a compelling insight: our brains process challenges to core beliefs similarly to how they process threats. When presented with evidence contradicting deeply held views, people often experience a physiological stress response—not just intellectual disagreement. Our neural circuitry seems designed to protect our worldview almost as vigilantly as our physical safety.

This explains why presenting facts rarely changes minds on emotionally charged issues. When someone responds to contrary evidence with anger or dismissal, they’re not being stubborn – they’re experiencing a neurological threat response.

Our brains evolved to prioritize social acceptance over objective truth – a survival advantage in tribal settings where rejection could mean death. This creates a fundamental vulnerability: we’re wired to conform to our social group’s beliefs even when evidence suggests they’re wrong.

So how do we overcome wiring this primal? …
