Cognitive biases and brain biology help explain why facts don’t change minds
- Written by Keith M. Bellizzi, Professor of Human Development and Family Sciences, University of Connecticut
“Facts First[1]” is the tagline of a CNN branding campaign which contends that “once facts are established, opinions can be formed[2].” The problem is that, however logical it sounds, this appealing assertion is not supported by research.
Cognitive psychology and neuroscience studies have found that the exact opposite is often true when it comes to politics[3]: People form opinions based on emotions, such as fear, contempt and anger, rather than relying on facts. New facts often do not change people’s minds.
I study human development, public health and behavior change[4]. In my work, I see firsthand how hard it is to change someone’s mind and behaviors when they encounter new information that runs counter to their beliefs.
Your worldview, including beliefs and opinions, starts to form during childhood as you’re socialized within a particular cultural context. It gets reinforced over time by the social groups you keep, the media you consume, even how your brain functions. It influences how you think of yourself and how you interact with the world.
For many people, a challenge to their worldview feels like an attack on their personal identity and can cause them to harden their position. Here’s some of the research that explains why it’s natural to resist changing your mind – and how you can get better at making these shifts.
Rejecting what contradicts your beliefs
In an ideal world, rational people who encounter new evidence that contradicts their beliefs would evaluate the facts and change their views accordingly. But that’s generally not how things go in the real world.
Partly to blame is a cognitive bias that can kick in when people encounter evidence that runs counter to their beliefs. Instead of reevaluating what they’ve believed up until now, people tend to reject the incompatible evidence[5]. Psychologists call this phenomenon belief perseverance. Everyone can fall prey to this ingrained way of thinking.
Being presented with facts – whether via the news, social media or one-on-one conversations – that suggest your current beliefs are wrong can make you feel threatened. This reaction is particularly strong when the beliefs in question are aligned with your political and personal identities. It can feel like an attack on you if one of your strongly held beliefs is challenged.
Confronting facts that don’t line up with your worldview may trigger a “backfire effect[6],” which can end up strengthening your original position and beliefs, particularly with politically charged issues. Researchers have identified this phenomenon in a number of studies, including ones about opinions toward climate change mitigation policies[7] and attitudes toward childhood vaccinations[8].
Focusing on what confirms your beliefs
There’s another cognitive bias that can get in the way of changing your mind, called confirmation bias. It’s the natural tendency to seek out information or interpret things in a way that supports your existing beliefs[9]. Interacting with like-minded people and media[10] reinforces confirmation bias. The problem with confirmation bias is that it can lead to errors in judgment[11] because it keeps you from looking at a situation objectively from multiple angles.
A 2016 Gallup poll provides a great example of this bias. In just one two-week period spanning the 2016 election, both Republicans and Democrats drastically changed their opinions[12] about the state of the economy – in opposite directions.
But nothing about the economy had changed. What had changed was that a new political leader from a different party had been elected. The election outcome changed survey respondents’ interpretation of how the economy was doing – confirmation bias led Republicans to rate it much higher now that their guy would be in charge; Democrats did the opposite.
Brain’s hard-wiring doesn’t help
Cognitive biases are predictable patterns in the way people think that can keep you from objectively weighing evidence and changing your mind. Some of the basic ways your brain works can also work against you on this front.
Your brain is hard-wired to protect you – which can lead to reinforcing your opinions and beliefs, even when they’re misguided. Winning a debate or an argument triggers a flood of hormones, including dopamine and adrenaline. In your brain, they contribute to the feeling of pleasure you get during sex, eating, roller-coaster rides – and yes, winning an argument[14]. That rush makes you feel good, maybe even invulnerable. It’s a feeling many people want to have more often.
Moreover, in situations of high stress or distrust, your body releases another hormone, cortisol[15]. It can hijack your advanced thought processes, reason and logic[16] – what psychologists call the executive functions of your brain. Your brain’s amygdala, which controls your innate fight-or-flight reaction[17], becomes more active when you feel under threat.
In the context of communication, people tend to raise their voice, push back and stop listening when these chemicals are coursing through their bodies. Once you’re in that mindset, it’s hard to hear another viewpoint. The desire to be right, combined with the brain’s protective mechanisms, makes it that much harder to change opinions and beliefs, even in the presence of new information.
You can train yourself to keep an open mind
In spite of the cognitive biases and brain biology that make it hard to change minds, there are ways to short-circuit these natural habits.
Work to keep an open mind. Allow yourself to learn new things. Search out perspectives from multiple sides of an issue. Try to form, and modify, your opinions based on evidence that is accurate, objective and verified.
Don’t let yourself be swayed by outliers. For example, give more weight to the numerous doctors and public health officials who describe the preponderance of evidence that vaccines are safe and effective than to one fringe doctor on a podcast who suggests the opposite.
Be wary of repetition, as repeated statements are often perceived as more truthful[18] than new information, no matter how false the claim may be. Social media manipulators and politicians know this all too well.
Presenting things in a nonconfrontational way allows people to evaluate new information without feeling attacked. Insulting others and suggesting someone is ignorant or misinformed, no matter how misguided their beliefs may be, will cause the people you are trying to influence to reject your argument. Instead, try asking questions that lead the person to question what they believe. Opinions may not ultimately change, but the chance of success is greater[19].
Recognize we all have these tendencies and respectfully listen to other opinions. Take a deep breath and pause when you feel your body ramping up for a fight. Remember, it’s OK to be wrong at times. Life can be a process of growth.
References
- ^ Facts First (www.cnn.com)
- ^ once facts are established, opinions can be formed (www.cnncreativemarketing.com)
- ^ exact opposite is often true when it comes to politics (doi.org)
- ^ I study human development, public health and behavior change (scholar.google.com)
- ^ reject the incompatible evidence (doi.org)
- ^ backfire effect (doi.org)
- ^ opinions toward climate change mitigation policies (doi.org)
- ^ attitudes toward childhood vaccinations (doi.org)
- ^ supports your existing beliefs (doi.org)
- ^ Interacting with like-minded people and media (doi.org)
- ^ can lead to errors in judgment (doi.org)
- ^ drastically changed their opinions (news.gallup.com)
- ^ winning an argument (us.macmillan.com)
- ^ another hormone, cortisol (www.ncbi.nlm.nih.gov)
- ^ hijack your advanced thought processes, reason and logic (doi.org)
- ^ controls your innate fight-or-flight reaction (doi.org)
- ^ perceived as more truthful (doi.org)
- ^ chance of success is greater (doi.org)