Confirmation Bias
What Is Confirmation Bias?
“The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of a doubt, what is laid before him” (Leo Tolstoy, The Kingdom of God Is Within You [tr. Constance Garnett, 1894]).
Confirmation bias is the tendency of people to interpret, remember, and selectively seek out information that confirms beliefs they already hold, and to ignore information that disconfirms those beliefs.
This means that when new information confirms someone’s beliefs or prejudices, they tend to embrace it. On the other hand, when new information challenges something a person believes, they tend to reject it.
Confirmation bias is also called “myside bias” because of the way people look only for information that supports their side. Exhibit I shows the types of confirmation bias.
Exhibit I: Types of Confirmation Bias
It has long been understood that people can become so entrenched in their beliefs and biases that they deny information that challenges their position.
The Greek historian Thucydides, who lived from 460 to 395 BCE, recognized this truth long before psychologists did. In The History of the Peloponnesian War, Thucydides wrote:
- “… for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy.”
Ibn Khaldun, an Arab historian who lived from 1332 to 1406, also understood this truth, as he noted in the Muqaddimah:
- “… if the soul is infected with partisanship for a particular opinion or sect, it accepts without a moment’s hesitation the information that is agreeable to it.”
Confirmation bias affects our everyday lives in a multitude of ways, from the obvious to the subtle. You’ve probably heard of conspiracy theories before; for example, there is a group of people who believe that humans never actually landed on the moon. (A surprisingly large number of people believe this, between ten and twenty-five million, according to Phil Plait, who writes about the “Moon Landing Hoax” in his book Bad Astronomy.)
Confirmation bias is a big part of how these conspiracy theories are built and why they persist. The people who believe in them seek out information that confirms the theory (“You can’t see the stars in the astronaut photos!”) and reject any information that contradicts it (“I don’t believe that the photos all had very short exposure times and the stars were too faint because it was daytime on the moon!”).
While this kind of thing can seem funny when it’s people claiming we didn’t land on the moon, it becomes far more harmful when confirmation bias helps people spread the false claim that vaccines cause autism (they don’t) or the vicious lie that the Sandy Hook Elementary School shooting was a hoax.
Confirmation bias in the public has made debating and acting on important issues such as global climate change extremely difficult. Climate change “skepticism” can be reinforced by something as simple as local record high and low temperatures: people in countries that have recently experienced many record low temperatures tend to be more skeptical that climate change is real.

Demonstration of the Confirmation Bias Effect
In the 1970s, Stanford researchers began conducting experiments to see if confirmation bias was a real effect. In one experiment, conducted in 1975 by Ross, Lepper, and Hubbard, the researchers showed a group of undergraduate volunteers pairs of suicide notes.
The subjects of the experiment were told that one of the notes in each pair was a genuine suicide note and the other had been made up. They were then asked to identify which note was real and which was fake. Some of the subjects were told that they had guessed almost every note correctly, and some were told that they had consistently failed.
The purpose of the experiment wasn’t to test their detective skills; it was to see how they would react to being given false positive and negative feedback. The experimenters then told the students that they had been deceived to some extent about how right or wrong they’d been, and asked them to estimate how many “hits” or “misses” they thought they had actually gotten.
The results of the self-estimation were curious. Students who had been given a lot of false positive feedback still tended to estimate their number of “hits” as higher than average, and those who had been given false negative feedback tended to estimate their “misses” as higher than the real average. The belief in their abilities, instilled falsely, persisted even after the deception was revealed.
No one is immune to confirmation bias; it’s simply part of how the human mind works. But being aware of its existence and of how it can creep into your life is the best defense. A growing driver of confirmation bias in the modern world is the internet, particularly social media, where it’s easy to surround yourself with a “bubble” of like-minded people.
