In the opening chapter of his book Beyond Good and Evil, Friedrich Nietzsche argues that philosophers have always had a strange, pathological obsession with “truth.” Truth is seen as the greatest good in the universe, and, if we believe Socrates, all the bad and evil in the world stems from ignorance of this truth. And so, libraries of books have been devoted to “What is truth?”, “How can we know the truth?”, and “What is and isn’t true?”

But what if most people don’t actually want the truth, but simply want to be right? In such cases, the truth might be a liability. When what philosophers, scientists, or experts present as “true” makes someone wrong, their minds will do something odd: they will lock down. And according to the philosopher Chris Ranalli, when this happens, we should call it “indoctrination.”

So, what is indoctrination, and how can we fix it?

The cage you build yourself

In his work on social epistemology, Ranalli argues that indoctrination isn’t just about what you believe, but about how that belief is sealed off from the rest of the world. What Ranalli calls “epistemically insulating content” is any belief that comes prepackaged with the instruction that seriously questioning it is either irrational or immoral. It’s a psychological cage, one whose door stays barred from both sides.

Ranalli points out that indoctrination involves preemptively dismissing any counter-evidence that may come along. In an epistemically neutral situation, someone might look at the evidence that arrives, weigh it up, and accept or reject it before you can say “Bayesian analysis.” But when you’re indoctrinated, you’ve already decided, before the evidence shows up, that it cannot possibly be valid. If somebody offers a contrary opinion, even very gently or reasonably, you don’t see it as a reason to examine your own belief. Instead, you’re trained to view it as a “test of faith,” or as “propaganda” from an enemy, or as proof that the other side is deluded. As Ranalli puts it, indoctrination has a kind of “defence mechanism where any contrary view is considered to be irrational, or even immoral.”

This is why arguing with the indoctrinated feels like talking to a wall. You’re not dealing with a person who has weighed the evidence and come to a different conclusion. You’re dealing with a person whose entire belief system has been immunized against the possibility of being wrong. To them, counter-evidence is an enemy, a kind of temptation, and temptation might come bearing a peer-reviewed scientific paper.

In a 2022 paper titled “Closed-minded Belief and Indoctrination,” Ranalli argues that the indoctrinated person is not necessarily any less intelligent than anybody else. They can be sharp, articulate, and well-educated. They might have read every book available on the topic. But the defining difference is that the indoctrinated are constantly at war. They’ve come to see doubt as a danger. And this is why indoctrination is not about any specific kind of belief but rather the resistance to revising that belief — in Ranalli’s words, “indoctrination can occur for liberal democratic beliefs as much as it occurs for fascist, fundamentalist, or fanatical belief.”

The off-ramp

So what do we do about it?

Whenever we encounter the truly indoctrinated, most of us have this antagonistic urge to challenge them. We marshal our facts, line up our arguments, and charge at their battened-down fortress. We think that if we just present enough evidence, if we’re just persuasive enough, we can batter the gates open.

But this is precisely the wrong approach. Besieging someone only catalyzes those defense mechanisms. Every counterargument you throw at them becomes further proof, in their mind, that the world is hostile and their beliefs need protecting.

Instead, we need to approach the indoctrinated with patience and a good deal more generosity. Ranalli argues that we need to offer people an off-ramp. The key to inviting the indoctrinated mind to open its gates is to make the person feel safe and secure enough to doubt.

Doubting is scary. Doubt makes you vulnerable. If you’ve built your entire sense of self around a set of beliefs, then questioning those beliefs is a kind of self-destruction. This is why so many people cling to beliefs that, from the outside, seem obviously flawed. It’s not that they can’t see the cracks, but rather that acknowledging the cracks is terrifying. And if the only people pointing out those cracks are doing so with contempt, with mockery, and with the smug satisfaction of being right, then it makes perfect sense to dig in and resist.

This requires a kind of epistemic compassion. We need to create spaces where doubting is seen as okay, where changing your mind isn’t treated as weakness or betrayal but as something brave, and where the person on the other side of the argument isn’t an enemy to be defeated but a fellow human being who might just need permission and space to think differently.
