Most topics are ones where either we hardly know anything or where we think we know so much that we are very confident.
Funny thing: regardless of the topic of discussion, whether we are uninformed or knowledgeable on it, when someone asks for our view, most of us can’t resist the urge to share one.
Many of us, my past self included, fear that saying “I don’t know” makes us look stupid; the rest of us are simply bad at shutting up.
So we express an opinion regardless.
In this essay, I want to look at two questions: What actually happens when someone states a belief? And what happens when we subsequently disagree?
How we think it works, but it doesn’t
It’s your lucky day and we run into each other and we end up talking about cycling, which, you mention, is a sport you haven’t been following at all because it makes you fall asleep. After a while, you nevertheless say “I think Geraint Thomas will win the Tour de France” or “I believe many cyclists (still) use doping.”
Here’s the standard model of what went down when you made this assertion:
- Person has an opinion on who will win the Tour or about how common cheating is in the peloton. Person’s opinion is there in her head, like a thing you can go and count.
- Some other human asks person about her view on the Tour or the efficacy of anti-doping tests.
- Person reports her pre-formed conviction.
According to this account, the stance you just expressed when you uttered “I think that …” was already there, and by asking you about it, your interlocutor made it come to the surface, solicited it, extracted it from your head.
Research has shown that this piece of folk psychology gets it wrong.
Let’s unpack this step by step.
Phase 1: fear of looking stupid
In the majority of cases, let’s be honest, we simply don’t have a worked-out opinion. We haven’t spent much time meditating on the subject at hand.
When someone asks us about our thoughts, we don’t want to appear dumb so, rather than confessing ignorance, we pretend to have a position anyway.
It’s not the already-present-belief that prompts us to express a belief here, but our dread of losing face when we say we don’t know (or our inability to shut up).
In our little thought experiment, let’s say you haven’t spent any time looking into the question of doping, yet you nevertheless have firm beliefs about cyclists’ ethics: “‘Lance Armstrong’ sounds like someone who went to the moon. Oh, and they all dope,” you proclaim confidently, adding a scoff: “Everybody knows that.”
This may seem innocent, but it has nasty far-reaching consequences, as we’ll see later.
Phase 2: post-hoc rationalization
Before we get to that, we need to put more pieces of the puzzle in place. If the claim we make isn’t a belief we already had, then why do we utter the particular belief we do?
The reason you just gave that opinion isn’t that it’s the position you already held; it has to do with (a) a bias or (b) a social factor.
Perhaps you’re British. If someone points out to you that Thomas hasn’t performed at all this season, you can always hide behind the fact that casually supporting a fellow countryman licenses some naive optimism. Or you and your friends are cynical self-proclaimed “realists”, so nothing in this world could make you change your mind about how many cheaters you think the peloton houses (side note I can’t resist: if someone’s stance is unfalsifiable, that says almost everything you need to know about it).
Alternatively, we consult our gut-feelings to ascertain the correct view on a given question. Many of the opinions we have may be the simple result of these hunches.
For example, we often use the availability heuristic, a mental shortcut that relies on immediate examples that come to mind when evaluating a proposition. You feed the words “Tour de France winner” to your mind and the first thing your intuition spits out is “Geraint Thomas”, perhaps with some vague memories of him winning it last year.
Summing up: for most topics, instead of consulting our inner database of pre-formed things-we-think-are-true, we figure out what to say on the spot, not by using causal object-level models of the subject in question but by relying on mechanisms ensuring we say something that’s socially acceptable and true enough.
Phase 3: social consistency
So far, we’ve seen that we care primarily about giving a ‘safe’ answer — non-weird and “yup, sounds plausible” — while at the same time keeping up the appearance of being knowledgeable.
Notice the chronological structure of how we come to assert an ‘opinion’. We aren’t reporting some pre-existing belief that was already stored in our head; rather, the ‘opinion’ is a post-hoc rationalization of something we said because of social factors or biases.
And when you’ve been seduced into making an assertion, something crucial has happened. Once we have made a choice or taken a stand, we will encounter personal and interpersonal pressures to behave consistently with that commitment. Those demands will cause us to respond in ways that justify our earlier decision.
That, for better or worse, is more often than we’d like to admit the explanation of why we believe what we believe.
But hold on.
It just seems too much to believe that a judgment we express, not because we had thought the issue through but merely because it was a socially safe thing to say or because some quick-and-dirty heuristic prompted it, is something we feel committed to afterwards.
Why would I end up actually defending some opinion just because I whimsically asserted it?
Because we don’t want to admit that the whimsical expression was just a whimsical expression.
Once we’ve publicly stated our view, even if it was the result of a hardly-rational process like this, social dynamics take care of the rest and ‘lock us in’. That’s why a seemingly innocuous assertion has such icky upshots down the road.
The social cost of changing your mind
Once you have said the magical words (asserted the opinion), you are now committed to defending what started as a quick-and-dirty approximation as if it were your considered view for eternity on pain of social ridicule.
Self-contradiction is culturally shameful. We are supposed to be faithful to our opinions. One becomes a traitor otherwise.
Accordingly, people are driven to be consistent in all areas of life — remarks, deeds, attitudes, opinions, beliefs, values, habits, and promises. Once a person makes a decision, she strives to make all future behavior match this past behavior.
As documented in Robert Cialdini’s seminal book Influence, dieters stick with diet programs they’ve paid for, even long after it’s clear they don’t work. College students become loyal to campus societies after they’ve gone through embarrassing hazing. Donors find it difficult to refuse later appeals once they’ve donated to a cause.
Make up your mind about something once, and you never have to think about it again.
According to the theory of ‘cognitive dissonance’, when reality falsifies our deepest beliefs, we would rather fiddle with reality than update what we regard as true. If the facts don’t fit our worldview, that’s too bad for the facts. This helps us stick to what we know and avoid the chance of disappointment, embarrassment, failure, and loss.
Learning to admit you’re wrong is the price of intellectual freedom (from your own past actions), but it’s a cost so high that most refuse to pay it.
Why do you believe what you believe?
It is a misunderstanding to think of the self in terms of a set of pre-formed convictions. Rather, many of the opinions we have may simply be the result of socialization and biases followed by social consistency pressures.
Once we blurt out some opinion, interpersonal pressures make us behave consistently with that speech act. Escalation of commitment: we stubbornly hew to an opinion for the mere fact that we have voiced it in the past. Finding pleasure in changing your mind is a rare trait among homo sapiens.
So maybe you should cut yourself and others a bit more slack about your disagreements.
We can’t change human social dynamics, so it might be wise to refrain from expressing an opinion if you actually don’t have one — before you know it, you have to either defend it until you die or do the horrible thing of admitting you were wrong. This is not to be underestimated: many people get married to their ideas all the way to the grave.
Say “I don’t know” more often.