I don’t honestly know what you mean.

Curiously, what you said just now goes to the core of what I’m saying.

Which is?

That we don’t always know what we don’t know. In fact, your candour in acknowledging the limits of your knowledge is rarer than you might think, according to cognitive scientists.

What does it have to do with the citizenship controversy?

I was watching some television footage of interviews with some of those who were protesting in Delhi against the recent Citizenship (Amendment) Act — and it was quite revelatory.

In what sense?

The interviewers asked some of the stone-throwing protestors what they were protesting against, and strikingly, few of them seemed to have a clear answer. Some threw around buzzwords such as CAB (the Citizenship Amendment Bill), CAA (the Act of the same name) and NRC (the National Register of Citizens), but for the most part they sounded clueless.

How do you account for that?

Economists and cognitive scientists have found that humans are susceptible to what’s called the ‘illusion of explanatory depth’. In an influential 2002 research paper, The misunderstood limits of folk science: An illusion of explanatory depth, Yale psychologists Leonid Rozenblit and Frank Keil theorised that “most people feel they understand the world with far greater detail, coherence, and depth than they really do.” This illusion applies even to people’s perceptions of how everyday gadgets work. Similarly, in 2006, Rebecca Lawson at the University of Liverpool established that most people’s conceptual understanding of familiar objects — such as a bicycle — is sketchy and shallow. And when it comes to an understanding of political issues, the illusion of confidence is even more pronounced.

Tell me more.

In his Sceptical Essays, Bertrand Russell noted that “the opinions that are held with passion are always those for which no good ground exists; indeed, passion is the measure of the holders’ lack of rational conviction.” The political landscape in India today seems to bear that out in abundant measure.

So where does this illusion spring from?

As Steven Sloman, professor of cognitive, linguistic, and psychological sciences at Brown University (and co-author, along with Philip Fernbach, of The Knowledge Illusion: Why we never think alone) has noted, people typically fail to distinguish what they know from what others know. Additionally, in matters of politics, most of us tend to hold the beliefs that we do because the people around us hold those beliefs.

Can this ‘filter bubble’ be burst by providing more information?

It’s tricky, says Stanford economist Matthew Jackson. For instance, as part of an experiment on people’s understanding of climate change, he had his subjects read a set of abstracts from scientific articles. He found that people looking at the same article would interpret it very differently, depending on their initial position on climate change. This was an example of ‘confirmation bias’ at work.

How can this illusion be neutralised then?

In a 2013 paper, titled Political extremism is supported by an illusion of understanding, Fernbach (of the Leeds School of Business), Sloman and their co-authors concluded that while “several psychological factors increase extremism”, merely getting people to explain how a policy works made them aware of how poorly they understood it. Such “explanation generation” promises to be an effective “debiasing procedure,” they noted.

Bottom line?

It may sound unrealistic in the Indian context, but exposing Mr Know-Alls to the limits of their own understanding is the best ‘cure’ for ignorance.

A weekly column that helps you ask the right questions
