Belief Perseverance: Understanding Why We Cling to Our Convictions

Have you ever tried correcting someone's misinformation, only to find them clinging to their original belief even more tightly? In an ideal world, presenting factual information would dispel misconceptions. In reality, it often plays out differently: when confronted with evidence that contradicts their deeply held beliefs, some people not only reject the evidence but become even more convinced of their original stance. This psychological phenomenon is known as the Backfire Effect. This post explores its definition, its causes, and how to navigate it effectively.

What is Belief Perseverance?

Belief perseverance is the tendency to maintain a belief even after the evidence supporting it has been discredited. Its most striking form is the Backfire Effect, a cognitive bias in which individuals actually strengthen their pre-existing beliefs when confronted with evidence that contradicts them. Instead of accepting the new information and revising their beliefs, they dig in their heels and resist the correction.

This counterintuitive response was first explored by Brendan Nyhan and Jason Reifler in their 2010 study, where they found that attempts to correct false beliefs can sometimes have the opposite effect, reinforcing the erroneous views instead of dispelling them.

The backfire effect highlights a critical obstacle in the process of changing entrenched beliefs. When individuals encounter information that contradicts their deeply held beliefs, instead of reassessing their positions, they may:

  • Reinforce their beliefs: Individuals often find ways to rationalize the contradictory information, reinforcing their original stance.
  • Dismiss the evidence: They might dismiss the new evidence as biased or unreliable.
  • Feel threatened: Challenges to their beliefs can be perceived as personal attacks, leading to defensive responses.

Examples of the Backfire Effect in Action

  1. Political Beliefs

    One of the most prominent arenas for the backfire effect is politics. For example, if a person strongly believes that their preferred political candidate is honest and trustworthy, presenting evidence of the candidate's dishonesty might not only fail to change their mind but make them support the candidate more fervently. This reaction can be seen as an effort to protect an identity that is bound up with their political beliefs: rather than altering their view, they may argue that the evidence is a smear campaign or fake news.

  2. Health Misinformation

    The backfire effect is also evident in the realm of health beliefs, such as vaccination. For instance, some parents who believe that vaccines are harmful might encounter scientific evidence that disproves their fears. Instead of accepting the evidence, they might double down on their belief, interpreting the information as part of a conspiracy by pharmaceutical companies. They may then seek out other sources that support their anti-vaccine stance, thereby entrenching their beliefs even further.

  3. Climate Change Denial

    Another example is the denial of climate change. Individuals who believe that climate change is a hoax may become even more convinced of their stance when presented with scientific evidence demonstrating the reality and urgency of the issue. They may perceive the evidence as a threat to their economic or political interests, or to their worldview that opposes regulatory interventions. Consequently, they might reject the evidence as politically motivated or fraudulent.

Key Studies and Evidence

  1. Nyhan and Reifler's 2010 Study

    Nyhan and Reifler's seminal study provided empirical evidence of the Backfire Effect. They conducted experiments where participants were presented with corrective information about political misperceptions. The study found that in some cases, the corrections not only failed to reduce the misperceptions but actually made them stronger among certain groups.

  2. The Myth of the Rational Voter

    Bryan Caplan's book, "The Myth of the Rational Voter," examines why voters hold systematically biased beliefs about policy issues. Caplan argues that because a single vote almost never changes an outcome, the personal cost of holding a mistaken political belief is negligible, so people can afford to indulge beliefs that feel good rather than revise them when faced with contradictory evidence, a dynamic he calls "rational irrationality."

Factors Influencing the Backfire Effect

Understanding the causes of the backfire effect is crucial for navigating conversations in today's information-saturated world. Here, we'll delve into the key psychological factors that contribute to this cognitive bias:

  1. Worldview Threat: Our beliefs aren't just abstract ideas; they form the foundation of our identity and worldview. When corrective information challenges these core beliefs, it can feel like a personal attack. This triggers a defensive reaction, leading people to reject the information and cling even tighter to their existing beliefs to maintain a sense of coherence in their worldview.
  2. Emotional Investment: The more emotionally invested someone is in a belief, the more likely they are to experience the Backfire Effect. This emotional investment creates a strong incentive to defend their beliefs against contradictory evidence.
  3. Source Credibility: People are more likely to accept information from sources they trust. If corrective information comes from a source perceived as untrustworthy or biased, individuals are more likely to reject it and reinforce their existing beliefs.
  4. Complexity of Information: Simplified or easily digestible corrective information is more likely to be accepted than complex data that requires significant cognitive effort to understand. The complexity of new information can overwhelm individuals, leading them to revert to their familiar beliefs.
  5. Social Influence: The opinions of peers and social groups play a significant role in reinforcing beliefs. If a person's social circle supports a particular viewpoint, they are more likely to resist contradictory evidence to maintain group cohesion and approval.

Overcoming the Backfire Effect

While the backfire effect can be discouraging, it shouldn't stop you from trying to share accurate information. By being aware of this cognitive bias, you can approach conversations more strategically. Here are some tips:

  1. Effective Communication: Tailoring the message to the audience's values and presenting corrective information in a non-confrontational manner can reduce resistance. Using narratives and relatable examples rather than dry statistics can also help make the information more accessible and less threatening.
  2. Building Trust: Establishing trust and credibility with the audience is crucial. This involves being consistent, transparent, and empathetic in communication. People are more likely to consider new information if they trust the source.
  3. Incremental Change: Introducing new information gradually and in small doses can be more effective than overwhelming someone with a lot of contradictory evidence at once. This approach helps reduce cognitive overload and allows individuals to process and integrate new information more comfortably.
  4. Promoting Critical Thinking: Encouraging critical thinking and skepticism can help individuals become more open to reconsidering their beliefs. Teaching skills like evaluating the reliability of sources and identifying biases can empower people to approach new information more objectively.


The Backfire Effect highlights the complexities of human psychology and the challenges of changing deeply held beliefs. While it can be frustrating to encounter resistance to factual information, understanding the underlying mechanisms can help in crafting more effective strategies for communication and education. By building trust, fostering critical thinking, and approaching discussions with empathy and patience, we can create an environment where people are more open to revising their beliefs in light of new evidence.

  • Source:
    • Nyhan, B., & Reifler, J. (2010). When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior, 32(2), 303-330. doi:10.1007/s11109-010-9112-2
    • Flynn, D. J., Nyhan, B., & Reifler, J. (2017). The Nature and Origins of Misperceptions: Understanding False and Unsupported Beliefs About Politics. Advances in Political Psychology, 38(1), 127-150. doi:10.1111/pops.12394
    • Cook, J., Lewandowsky, S., & Ecker, U. K. H. (2017). Neutralizing Misinformation Through Inoculation: Exposing Misleading Argumentation Techniques Reduces Their Influence. PLoS ONE, 12(5), e0175799. doi:10.1371/journal.pone.0175799
    • Taber, C. S., & Lodge, M. (2006). Motivated Skepticism in the Evaluation of Political Beliefs. American Journal of Political Science, 50(3), 755-769. doi:10.1111/j.1540-5907.2006.00214.x
