The Fallacy of Origins and How to Be Less Wrong

Bubble

Chris grew up in an extremely religious family and, as a result, developed a strong faith in God himself. Katie’s parents are politically liberal and she has been exposed to many of their arguments; as a result, she shares a lot of typically liberal beliefs. Caleb comes from the southern United States and holds the firm conviction that gun ownership should be legal.


Why did Daenerys burn King’s Landing?

“She did it because she totally lost her mind!”

“She did it because she thought it was the right thing to do as it would liberate the city.”

Writing this before the final Game of Thrones episode comes out, I don’t know which one’s correct. But I want you to notice something about the form of these answers to my ambiguous why-question.

The first response gives a causal explanation: someone or something woke the dragon, as her late brother liked to say; this made a switch flip in her mind, and as things went dark inside her, the city was lit on fire.

The second line gives a justifying explanation. It doesn’t answer the why-question causally — by telling us how it happened — but explains ‘why’ she did it by citing considerations that, for the Dragon Queen, counted in favor of acting as she did. We might call this second type of response a justifying reason, as it is the kind of reason Khaleesi might cite in justifying her behavior.

See the difference?

In Dany’s case, it seems, one or the other provides the true pedigree of her action. If she did it because she thought it was right, it’s unlikely her behavior was inspired by a blackout.

However, these two types of explanation needn’t be exclusive like that.

Consider the question:

Why did you kiss her?

I can answer:

  1. “Because I was drunk” (causal explanation) or
  2. “Because I’m attracted to her” (justifying explanation).

Here they could both apply. For example, I might actually like her but be a bit shy. The fact that I had alcohol in my blood when I was eager to exchange saliva does nothing to render my classroom daydreams fake.

Remember the examples at the beginning? Now that we’ve pumped some intuitions, it’s time to switch gears from behavior to beliefs, truth, and being less wrong.

How about the question:

Why do you believe [atheism is true/democracy is good/science is more likely than voodoo to yield accurate information]?

What are the causal and justifying answers to the respective why-do-you-believe-X questions here?


“You only believe that because…”

The move from (i) the shocking discovery that you hold a belief because your life went a certain way to (ii) the conclusion that you’re therefore biased in holding this conviction is remarkably common these days:

“Nietzsche thought that unawareness of often unconscious motivators leads us to make vulgar and baseless moral claims. This insight is frequently misused today by a host of commentators who substitute pop-psychologism for real analysis of their opponent’s position.” — Matt McManus, Why We Should Read Nietzsche

“You just believe atheism is true because your parents are atheists.”

“You just believe democracy is the best system because it has worked well for an upper-middle-class white male like you.”

“You just believe science is more likely to be accurate than voodoo because you’re Western.”

It’s probable that a big causal factor in my believing atheism to be true is the way (and where) I was raised. But the fact that my take would have been different had I been born into a religious family doesn’t mean my grounds for believing atheism to be true are thereby invalidated.

If you recall that causal explanations (I was drunk) needn’t undermine justifying explanations (I like her), you’ll see why. Even though I was brought up a certain way, I can still give you reasons that might justify me in supposing that there’s no Transcendent Being.

The correct metaphysical story about the universe obviously has nothing to do with my upbringing.

What philosophers call the genetic fallacy, or the fallacy of origins, is the assumption that discovering a belief has a certain origin always undermines its justification. This is a fallacy because all beliefs have some causal origin, and surely not all your beliefs are unjustified.

Analyzing my socio-cultural location is one thing. Figuring out what theory is best-supported is another.

While such a genealogical story might remove some of the initial justificatory force of your idiosyncratic intuition in favor of a position, it doesn’t make the arguments for that position any less valid or sound.

Blame the Amortentia

On the one hand, these points inspired by the genetic fallacy ring true to me.

On the other hand, they might be too quick.

Philosophers like Marx, Freud, and Nietzsche have debunked views in philosophy by showing there’s something wrong with their causal history (for example, Nietzsche explains the source of our moral beliefs in terms of feelings of resentment). These arguments have value because one way to recognize that a view is unjustified is to see how it arose through a causal process we think is unreliable.

We can sometimes learn useful things about justification by looking at causal explanations. The genetic fallacy is not always a fallacy.

We’ve seen that it is erroneous to immediately assess the truth value of a belief on the grounds of its roots. However, the justificatory status of a belief can sometimes be assessed by looking at its origins. If I’m Harry Potter, and Romilda Vane has a crush on me and puts Amortentia — a love potion — in my butterbeer, after which I go on to sing her a serenade, then Ron and Hermione — and Ginny, of course — are perfectly justified in asserting that my belief that I love her is unjustified, just by finding out its origins.

Returning to the issue at hand: there remains something unsettling about the realization that one’s convictions can be traced back to something as arbitrary as where one was born.

And there is still something itchy — resistance-provoking — about critically probing how information concerning the cause of your belief might constitute a further reason to change your opinion on the matter.

The fact that my judgment results from having a particularly cultivated psyche doesn’t tell us whether it is true or false. That said, what seems true to me, or the premises I reason from, do not bear the seal of validity upon their sleeves. I cannot afford to ignore evidence about why I might assent to, or find plausible, certain propositions rather than others.

When does knowledge of the genealogy of a belief undermine that opinion?

Believing differently, and thinking you’re right to

It is uncontroversial that at least many of our beliefs are influenced by factors that are irrelevant to the actual proposition in question.

If I had been born in, for example, Yemen, I would probably have very different opinions on what the correct metaphysical theory about the universe is (and the same seems to go for my convictions about democracy).

But it’s not just that I would have held contrasting views.

The key driving worry in such cases is that had you been in the other situation, for better or worse, you would have thought you were right in your beliefs. You would cite various reasons for your different conviction and it would seem to you that these considerations were rationally persuasive.

Clearly, you can’t be accurate in both cases. Atheism and theism (of whatever form) can’t both be correct about the (non-)existence of a Transcendent Being.

But of course that is just our predicament any time we believe anything. Whatever I think, there is always the possibility of my having believed otherwise. And had I done so I would no doubt think that I was right.

And that, crucially, would be a bad result. If an argument aimed at a more limited target turns out to entail that we are not justified in believing much of anything, then this shows that we are imposing implausibly strong constraints on justification.

This raises the question: is there something distinctive about the doubts raised by considerations of what led us to believe as we do? Is there a stable position of doubt about those beliefs we were nurtured with? Or do these genetic-fallacy-ish arguments prove too much and collapse into a radical skepticism?

Where justifications end

In forming beliefs, we have intuitions of truth and falsity, good and bad, logical consistency, and causality that are foundational to our thinking about anything. Epistemic axioms inspired by them determine what we find reasonable at every stage of analysis.

The fact is that all forms of inquiry pull themselves up by some intuitive bootstraps. Gödel proved this for arithmetic, and it seems intuitively obvious for other forms of reasoning as well. — Sam Harris

At some point, you can’t get beyond these ‘intuitive bootstraps’, your fundamental standards.

Every time we consider our most fundamental standards we feel that they require some kind of endorsement from the outside. But we quickly run out of places to stand. We can’t step outside of all reasoning, as it were, to assess whether any of our reasoning is any good. — Roger White

Some examples: since science presupposes the lawfulness of nature, it cannot non-circularly provide evidence for the intuition that there is an intelligible regularity underlying reality. It also can’t use logic to validate logic; it presupposes the value of logic from the start. And physics can only inductively justify the intellectual tools one needs to do physics. Likewise, from a skeptical point of view, it is not obvious what non-perceptual grounds we could have to suppose that our perceptual faculties are reliable.

Unless one has reason to doubt that one is meeting one’s own fundamental epistemic standards, then, the problem raised by cases of upbringing is just the general skeptical worry that we have no independent support for the correctness of those standards.

So that can’t be the issue.

The real problem lies in our apparent inability to tell whether the factors determining where we get socialized, where we attend school, and how we come to hear about rationality, science, democracy, and our standards of reasoning gave us the right fundamental premises or ‘intuitive bootstraps’ to arrive at the truth.

When I draw some conclusion from such assumptions, whether by application of logic, statistical inference, or on the basis of the explanatory virtues of a theory, or what have you, my evidence consists in the premises from which I reasoned.

I’m justified by virtue of having taken rational steps from premises I had reason to accept.

For instance, Buddhist scientists now argue that non-material presuppositions explain the universe just as well as the materialistic postulates of modern science. And since the competing sets of premises have equal explanatory power, arriving at an inherently conscious worldview is no less valid than any of our constructs of material reality. Therefore, the premise stipulating that the essence of the universe is consciousness is just as valid as the premise stipulating that its essence is matter.

I have no idea who’s right, but the fact that there are apparently very smart, well-informed thinkers who differ in their opinions because they reasoned correctly from different premises they claim to be equally justified is perhaps a bit worrying, but above all super exciting.

Values from facts?

Remember the three examples from the beginning? (Atheism, science, democracy.)

I’ve argued that to upgrade your knowledge and disagree constructively, it’s better to focus on the quality of the arguments rather than the source of the evidence.

I now want to finish by laying out why I think the cases of science, religion, and the fundamental nature of the universe, on the one hand, are different from the case of democracy, on the other.

In the scientific case, the theistic and the atheistic inquirer disagree on empirical conclusions. Their first premises diverge descriptively, and accordingly, they end up with incompatible conclusions about what the fundamental nature of the universe is.

In the case of “You just believe democracy is the best way to run a country because…”, however, things are a little different. The conclusion is a normative one, not an empirical one. The statement is not about what is, but deals with how a country ought to be led.

That’s the disanalogy.

We cannot infer ought-conclusions from is-premises. This is known as ‘Hume’s Law’ in philosophy. The ought-to-be-doneness of democracy, for example, can’t be derived from descriptive facts without a normative intermediary such as ‘it ought to be the case that citizens can exercise political power’ (because they’re equal as human beings).

The only explanation of clashing prescriptive verdicts, then, notes that the arguing parties adhere to contrasting fundamental normative axioms to ‘link’ ‘is’ and ‘ought’. Had Genghis Khan been my father, I probably would not have agreed with the intermediary assumption that led me to value democracy, even though it has been considered self-evident in the post-Enlightenment Western tradition.

Are my beliefs about these issues undermined by their particular causal origin? Is there a way to satisfy the collective hunger for “unbiased views”?

Wrong question.

We’re not doing science here. We’re not in the business of figuring out truths about reality. Our practical goal of figuring out how to act is not best served by requiring ascent to a “view from nowhere”.

We are disagreeing not about what the world is like, but about what to allow or permit, or how to react, or what to do, or what to admire or condemn.

So it’s better to be transparent about your assumptions and convictions — about where you’re coming from — than to feign neutrality and pretend you don’t have any.

We could all start by putting our skin in the game, by being honest about our biases and tribal affiliations. We could abandon the pretense to neutrality and more honestly engage with each other, allowing arguments to go to our value structure and philosophical foundations more quickly.

Perhaps there’s a slight chance we’ll get beyond the mud-throwing this time.


Like to read?

Join my Thinking Together newsletter for a free weekly dose of similarly high-quality mind-expanding ideas.
