“These are the facts! Why won’t you believe them?” is one of the more common complaints you hear during an argument. Most of us know the feeling of talking to someone about, let’s say, climate change, gun control, abortion or the death penalty. You think they’re wrong and you have the evidence to prove it, but they just won’t listen to the facts. Why do people do this? Why don’t facts change people’s minds?
To answer this question, we have to ask another one that may seem so self-evident you’ve probably never bothered to ask it: what do our brains use information for?
If you said something like “to know what’s true” or “to get an accurate picture of reality”, you are only half-right. Our brains use information for many purposes, and “knowing truth” is just one of them. Others include “confirming existing beliefs”, “protecting our self-image” and “maintaining group loyalty” (1). I’m going to call these “non-truth goals”: ways our brains use information that don’t need to correspond to the truth.
Psychologists have long been aware that we reason towards goals other than truth. They call it “motivated reasoning” (2), a rarely used term that explains a great deal of why people stubbornly believe things that aren’t true. So how is motivated reasoning different from regular reasoning?
Most of us know reasoning as a “bottom-up” process. We gather and analyse facts, and use them as evidence to support true statements about reality. This is what we mean when we say “follow the facts wherever they lead”. Motivated reasoning, however, is a top-down process. We start with the psychological goals we want to pursue, such as confirming existing beliefs, protecting our self-image or maintaining group loyalty, and then believe whatever information helps us fulfil those goals. Think of your brain as a courtroom. Regular reasoning is the judge, who weighs up all the evidence and tries to reach the best decision. Motivated reasoning is more like a lawyer: it starts with the conclusion it wants to reach and then collects and frames only the information that supports that case. (3)
Motivated reasoning reframes the problem of false belief. Believing falsehoods is not always a case of a truth-machine malfunctioning, but of a complex brain with multiple competing motivations being induced to pursue something other than truth.
What is this inducement? What flips the switch between our inner lawyer and our inner judge? The answer is emotion. If I told you Denmark has an area of 42 924 square kilometres and that the atomic number of polonium is 84, you would likely believe me without a second thought, because your brain would register almost no emotional response to either statement. Even in this so-called “post-truth” era, most facts still fall into this category. However, if I told you facts about climate change, the death penalty, abortion or gun control, you would feel an immediate pang of positive or negative emotion before you had even had a chance to consciously process the statement itself (4), and that pang would trigger your nefarious inner lawyer. Importantly, this emotional trigger usually occurs outside of conscious awareness, so people can accept and reject facts based on it and still feel like they are being objective.
This is why motivated reasoning is so effective and so frustrating. It doesn’t make us blindly adhere to outright falsehoods without a shred of evidence; that would be too obvious. Instead it uses subtle emotional cues to subconsciously steer us towards the “evidence” that supports what we want to believe. Not only are we convinced of our position, we even have the “evidence” to prove it!
Towards the end of any discussion about psychology, post-truth and polarisation it’s obligatory to offer some vague advice to give the illusion that something can be done. It usually takes the form of “Hey, you know that universal, inbuilt tendency I just explained that makes us process information badly without even realising it? Let’s all agree to do it less”. The problem with pervasive, unconscious, and automatic processes is that they are pervasive, unconscious and automatic.
I could say: next time you are in a conversation with someone who just won’t listen to the facts, remember they aren’t doing it deliberately. Think about the other motivations they, and you, might have for your beliefs, and rather than trying to overwhelm them with evidence, try addressing the subtle emotional reasons they might have for rejecting the information. Does it threaten their identity? Their self-image? Would believing it put them at odds with their friends and family?
That might all work, but I think the more practical advice is this: next time you are in a conversation with someone who just won’t listen to the facts, don’t hate them for it.
(1) Jost, J. T., Hennes, E. P., & Lavine, H. (2013). Hot political cognition: Its self-, group-, and system-serving purposes. In The Oxford Handbook of Social Cognition (pp. 851–875).
(2) Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.
(3) Ditto, P. H., Pizarro, D. A., & Tannenbaum, D. (2009). Motivated moral reasoning. Psychology of Learning and Motivation, 50, 307–338.
(4) Lodge, M., & Taber, C. S. (2007). The rationalizing voter: Unconscious thought in political information processing.