The Ultimate Explanation of the Psychology Behind the Spread of Fake News

Chains

“Crazy fairy tales have become numbingly common,” Vox’s David Roberts recently observed.

We are more apt to be wrong about basic facts than at any time in the recent past.

I have often been left frustrated by someone who subscribes to absurd stuff — about evolution or immigrants or Obama’s birth certificate or you name it — and proceeds to shrug off any appeal to facts or evidence in blissful ignorance.

I’m picturing his face right now.

And I get angry when I notice that ‘discernible reality’ is a stock of faltering value. That a certain carelessness about reality is on the rise. That norms of accuracy have lost authority. That entire communities remain wholly unswayed by the evidence. That political leaders seem to speak with a blatant disregard for the facts.

That truth no longer matters.

In this essay, I want to investigate this phenomenon.

As usual, the most widely held explanation — that more and more people have lost all interest in evidence and investigation — is wrong.

Believing on the basis of testimony

In our age of hyper-specialization, we believe many things on the basis of claims other people make. Philosophers call this “believing on the basis of testimony”.

When we regard something as true on the basis of testimony, we believe it because it was presented to us by another person. This is usually a normal and, philosophers will add, ‘epistemically virtuous’ practice. There is, for example, nothing irrational about me believing that there are three atoms in a molecule of water — H2O — because my high-school science teacher told me so.

I haven’t detected the atoms myself, yet it’s perfectly fine for me to believe it anyway. On the basis of testimony.

If this sounds too easy, consider that there are benefits to organizing knowledge like this. A community of people with a practice of accepting one another’s word will be able to learn far more than individuals who insist upon believing only what they discover on their own.

But of course, you shouldn’t just uncritically accept what everyone says to you.

  1. It has to do with who is delivering the message.
  2. It has to do with why she does so. It is wise, for example, to suspend default acceptance of testimony from the person telling you how smoothly this Volvo rides when she’s selling it to you (or is otherwise incentivized to bend the truth).
  3. It also has to do with what is being claimed. Like any source of evidence, it is reasonable to suspend confidence in a piece of testimony if it is radically at odds with what you already know about how the world works. “If a new acquaintance tells me that she saw a squirrel steal a park-goer’s slice of pizza, I’m going to believe her. If she tells me that she saw a squirrel steal a police officer’s handgun and rob a bank, I’m going to require further evidence,” philosopher Regina Rini amusingly explains.

But if none of them — the (i) person, her (ii) motivation, or the (iii) content of what she’s presenting — sets off any alarm bells, then we typically accept testimony from others.

Justifiably so.
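To make the three checks concrete, here is a minimal sketch of testimony acceptance as a default-accept decision rule. The names and the threshold are illustrative assumptions of mine, not a model from the literature:

```python
from dataclasses import dataclass

@dataclass
class Testimony:
    source_trusted: bool         # (i) who is speaking: a source with a decent track record?
    conflict_of_interest: bool   # (ii) why she speaks: incentivized to bend the truth?
    plausibility: float          # (iii) what is claimed: fit with my map of reality, 0..1

def accept(t: Testimony, threshold: float = 0.3) -> bool:
    """Accept by default, unless one of the three checks sets off an alarm bell."""
    if not t.source_trusted:
        return False             # (i) fails: untrusted speaker
    if t.conflict_of_interest:
        return False             # (ii) fails: the Volvo saleswoman
    return t.plausibility >= threshold  # (iii) fails: too wild a claim needs more evidence

# Rini's squirrels: pizza theft is odd but believable; the armed bank robbery is not.
print(accept(Testimony(True, False, 0.60)))   # True  -> believe her
print(accept(Testimony(True, False, 0.01)))   # False -> require further evidence
```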

Until someone starts abusing this practice.

Fast and slow

To understand more about how testimony works, its relationship to fake news, and why it’s usually a rational way to assemble our map of reality, let’s take a detour through cognitive psychology.

In the 1980s, psychologists Richard Petty and John Cacioppo showed that people acquire beliefs by employing one of two distinct cognitive routes. (Their research is not unrelated to Daniel Kahneman’s famous Thinking, Fast and Slow.)

In some cases, we look at a problem through the “central route”. We make a diligent attempt to investigate the facts of the case. If you pick a new stereo by spending weeks combing the internet and asking your audiophile friends for advice, you’re using the central route.

In other cases, as Farhad Manjoo explains in his wonderful book True Enough:

Instead of taking the time to look through a mountain of data, you might simply choose to buy a Volvo because Consumer Reports rates it highly. In that case, you’d be employing the other cognitive pathway, what Petty and Cacioppo call the peripheral route. Here, we use “cues” — like emotional reactions or what an expert or a celebrity or some other trustworthy figure thinks — to guide us towards a decision.

Accordingly, I might choose a Volkswagen because my friends have one too, or a Hummer because Arnold Schwarzenegger has several, or a Tesla because I like weed.

Remember: peripheral processing → reliance on cues. That’s important.

Rationally taking the shortcut

When the volume of information increases, our ability to filter the relevant from the irrelevant suffers. In the absence of expert comment, then, we find ourselves drowning in a sea of facts divorced from meaning, trying to stay afloat amid all the numbers.

It’s perfectly rational, then, to let the peripheral route take over when stuff gets too complex.

Why wouldn’t it be?

When we simply don’t — can’t — understand the details of the case at hand, when the amount of data and -isms overwhelms us, or when we just don’t care enough, what’s blameworthy about believing someone who looks like an expert, talks like an expert, and says things that kind of fit with how we think the world works?

Consumer Reports has the resources to test every car on the market, and you do not. So if the magazine says that Volvo’s the way to go, you listen. This isn’t always a bad strategy. After all, how often is Consumer Reports wrong about some product endorsement? Rarely, or else it wouldn’t be as vaunted as it is. — Farhad Manjoo, True Enough

The whole point of using the peripheral route is to end up with a reasonably complete map of reality, one that also answers questions for which evaluating all the facts ourselves would take too much effort.

Do you know how many papers there are on climate change? A lot. Must you have read a big portion of them to wield a justified opinion during a watercooler chat? I hope not!

Why does this matter? It already suggests that a certain reluctance to fact-check everything we hear can’t be what explains why “crazy fairy tales have become numbingly common”. Peripheral processing has been our best friend for many years. Shouting at people to “start thinking for themselves” isn’t the solution; that kind of radical intellectual autonomy is a pipe dream.

Something more complex than laziness or irrationality on the part of individuals must be going on.

Complexity

On to part two of our little detour.

What determines when people use central or peripheral processing?

Generally, you use the peripheral route when you’re trying to save time or are being confronted with queries beyond your capacity to solve.

Economist Tyler Cowen estimates that the single biggest recent change in Western life has been the dramatic decline in the cost and inconvenience of getting information. All this data is empowering, surely. It gives us a peek into fields where only experts once dared to tread. There’s no need for you to blindly trust your chemistry teacher’s claims about water or your local pastor’s assertions about the age of the earth or the UN’s theories on climate change.

If you want, you could always switch to the central route.

If you want to find flaws in such analyses, that’s your only choice. You must dig into the substance of the positions, learn the chemistry/geology/biology, and investigate for yourself how their predictions compare against actual data. Only then might some of these arguments fall flat (or not).

But who actually does all that?

No one, of course. We take the peripheral route. That means, as Manjoo explained: we rely on cues (remember?).

To recap, when we process something peripherally, we rely on cues to assess it. We take the peripheral route when there’s too much data or the data is too complex. That’s exactly the situation we find ourselves in today.

And that has enormous consequences.

Dependency and vulnerability

This is where our detour ends: the crucial role of testimony re-enters the picture and meets our reliance on cues.

As the controversies that dominate our lives become ever more complex, as arcane information from outside our own expertise overwhelms the public discourse, the world is increasingly rendered comprehensible only through the eyes of the expert.

As modern knowledge depends on long chains of experts, we have no choice but to trust each other. In an age of hyper-specialization, the weight of testimony — the claims other people make — as a source of evidence and justification has greatly increased. This makes us vulnerable: since the relevant facts are beyond us, experts can disguise a lack of factual support for some “truth” as a matter of expertise.

“You wouldn’t understand.”

If you examine your most cherished thoughts about the world, you’re likely to find some expert or other lurking underneath. What’s the basis, for instance, of your thoughts on the threat of global warming or the business prospects of Apple? Or, indeed, of your conviction that a water molecule has three atoms?

As philosopher Elijah Millgram argues in The Great Endarkenment, modern knowledge depends on trusting long chains of experts. And no single person is in a position to check up on the reliability of every member of that chain. Ask yourself: could you tell a good statistician from an incompetent one? A good biologist from a bad one? A good nuclear engineer, or radiologist, or macro-economist, from a bad one?

This vulnerability and the need for trust can, in turn, be exploited. The particular cues we pay attention to may not be the ones that best guide us toward the truth. Peripheral processing all but invites us to introduce biases into which experts we trust.

In the rest of this essay, I’ll try to show that people who seem to have lost all interest in facts are not irrational but instead misinformed about where to place their trust. Their peripheral processing strategy is being gamed.

How to know where to place your trust?

So we are not in a position to check up on every member in our chain of experts. Our alternative diagnosis of the psychology behind the spread of fake news takes its cue from the crucial follow-up question this situation raises: can we at least know whom (not) to listen to?

If we can’t deduce who is right, can we at least determine who is likely to be right; who is trustworthy?

Recall the three conditions for accepting other people’s claims: (i) who she is (trustworthy or not), (ii) what her motivations are (is she praising the Volvo just because she wants to sell me this ride?), and (iii) whether what she says is more or less in line with your current map of reality (no, this car can’t fly).

People are trying to game these judgments.

They want to earn your trust despite not deserving it. Make you believe what they say, even though their statement violates one of the three norms for accepting testimony. So they make it look as if what they say has the right (i) who, (ii) why, and (iii) what, even though it doesn’t.

And, unfortunately, they are quite good at this. As a consequence, ordinary practices for deciding who is (not) trustworthy now mislead us.

Rather than people suddenly losing interest in facts, that’s the change that explains why “crazy fairy tales have become numbingly common.”

For example, big-number data seem trustworthy, but they are a classic cue for the peripheral route. To nonexperts, numbers can easily be made to look freighted with meaning when in truth they signal nothing out of the ordinary. In the video Is Most Published Research Wrong?, Veritasium persuasively explains how this works, following a notorious 2005 paper by Stanford epidemiologist John Ioannidis, Why Most Published Research Findings Are False.
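For the curious, here is a back-of-the-envelope sketch of the arithmetic behind that claim, via Bayes’ rule. The significance level and power are conventional defaults; the priors are illustrative assumptions of mine, not numbers from the paper:

```python
def ppv(prior: float, alpha: float = 0.05, power: float = 0.8) -> float:
    """P(finding is true | test came out significant): the positive predictive value."""
    true_positives = power * prior          # true hypotheses correctly detected
    false_positives = alpha * (1 - prior)   # false hypotheses passing by chance
    return true_positives / (true_positives + false_positives)

for prior in (0.5, 0.1, 0.01):  # fraction of tested hypotheses that are actually true
    print(f"prior={prior:0.2f}  PPV={ppv(prior):0.2f}")
# prior=0.50  PPV=0.94
# prior=0.10  PPV=0.64
# prior=0.01  PPV=0.14  -> in long-shot fields, most "significant" findings are false
```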

Lots of numbers? White coat? Journal reference? Not too wild of a claim? He’s not selling me something? The norms for accepting testimony are respected, and the peripheral cues give the green light, so we now regard something as true because this person told us so.

Congratulations: you have now been tricked into believing something that’s likely false.

That was just one example. The spread of fake news follows a similar model.

Exploiting a reasonable practice (or, the tragedy of the epistemic commons)

More than ever, we have no choice but to believe many things on the basis of (expert) testimony. We typically accept testimony from others, all else equal. Especially when peripheral cues give the green light.

However, as we just saw, it’s also easier than ever to abuse these conditions and mine the vulnerabilities of the many-media, many-experts world to sneak past our defenses.

People who put on the garb of experts and abuse our trust exploit gaps in otherwise reasonable norms of information processing.

They operate as a kind of social parasite on our unavoidable vulnerability, taking advantage of our epistemic condition and social dependency.

The psychology behind the spread of fake news

This reveals something about the people we started out with — those who seem to believe crazy things such as Obama being a secret Muslim or Pizzagate. Those who, it seems, have lost all interest in evidence or investigation, and have fallen away from the ways of reason (a philosopher might say).

The upshot of this piece is that that appearance is, in fact, misleading.

As we’ve seen repeatedly, experts and trust and testimony and chains play an ineliminable role in deciding what to believe as true. Ultimately, that’s because, in deciding what to regard as true, we can’t begin from nothing but have to start by assuming something and trusting somebody.

Imagine, for instance, as a thought experiment, someone brought up entirely in a Deep State echo chamber.

Our child has been taught the beliefs of the echo chamber, and to trust only the TV channels and experts who agree with the Deep State views of her tribe. This makes sense. It’s the same way we wouldn’t accept someone’s claims about flying cars, or about a squirrel stealing a police officer’s handgun and robbing a bank — they’re just too radically at odds with what we already regard as true about the world. In exactly analogous fashion, she is suspending acceptance of a claim after having assessed the trustworthiness of each source using her own background beliefs.

Her (to us) apparent ‘post-truth’ attitude can thus be explained as the result of manipulations of trust resulting in peculiar background beliefs.

Her earnest attempts at intellectual investigation are led astray by her upbringing and the social structure in which she is embedded. Because of her Deep State ideology, she will reject all plausible theories about what happened on 9/11 or to John F. Kennedy as false, and take any evidence we throw at her as untrustworthy. These Deep State beliefs have spread outwards, infecting her whole belief system and her bases for evaluation.
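To see how little irrationality this requires, here is a toy Bayesian sketch with purely illustrative numbers: the same update rule, fed manipulated beliefs about a source’s reliability, turns the very same report into evidence against the claim.

```python
def posterior(prior: float, p_report_if_true: float, p_report_if_false: float) -> float:
    """P(claim is true | the source reports it), by Bayes' rule."""
    numerator = p_report_if_true * prior
    return numerator / (numerator + p_report_if_false * (1 - prior))

# A reader taught that the newspaper tracks the truth:
print(posterior(0.5, p_report_if_true=0.9, p_report_if_false=0.1))  # 0.9

# Our echo-chamber child, taught that the same paper prints propaganda:
# she thinks it's MORE likely to report the claim when it's false.
print(posterior(0.5, p_report_if_true=0.3, p_report_if_false=0.7))  # 0.3
```

Same arithmetic, opposite conclusion: the defect sits in the trust parameters she was handed, not in her reasoning.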

That’s why we can ascribe her tendency to buy fake news to ‘being in the wrong chain’ rather than to personal vices.

According to this ‘bottom-up’ explanation, then, “unreasonable” people have not stopped caring about reality and truth and evidence. It’s simply that their basis for evaluation — their background beliefs about whom to trust — is off from the start. They are not irrational, but systematically misinformed about where to place their trust.

To explain why people believe crazy things, there’s no need to add hypotheses about them being dumb or stupid or foolish or sheep.

We don’t have to cook up a complete disinterest in facts, evidence or reason to explain the increased prevalence of seemingly absurd ideas. We simply have to attribute to certain communities a vastly divergent set of trusted authorities, combine it with the cognitive science of central and peripheral processing, recall the unavoidability of trust and believing on the basis of testimony in forming knowledge, and notice the widespread manipulation of this vulnerability.

I leave you with this assignment from philosopher Thi Nguyen:

Listen to what it actually sounds like when people reject the plain facts — it doesn’t sound like brute irrationality.

One side points out a piece of economic data; the other side rejects that data by rejecting its source. They think that newspaper is biased, or the academic elites generating the data are corrupt. An echo chamber doesn’t destroy its members’ interest in the truth; it merely manipulates whom they trust and changes whom they accept as trustworthy sources and institutions.

