What Is Counterfactual Thinking?
Our understanding of an event is influenced not only by what actually happened, but also by what “coulda, woulda, or shoulda” happened. These thoughts about how the past might have happened differently are known as counterfactuals—mentally imagining alternatives to past or present factual circumstances.
Counterfactual means “contrary to the facts.” Counterfactual thinking refers to reconstructive thoughts about a past event, in which antecedents to the event are mentally mutated and possible changes to the outcomes are contemplated (Kahneman & Tversky, 1982).
Cognitive and social psychologists are interested in how lay perceivers use counterfactual thinking in everyday life.
They aim to understand both when counterfactual thinking normally occurs and which counterfactual constructions of reality, from the infinite number of possible ones, are most likely to be generated by the average person.
Douglas Hofstadter, a cognitive science professor at Indiana University and author of the Pulitzer Prize-winning Gödel, Escher, Bach: An Eternal Golden Braid, wrote, “Think how immeasurably poorer our mental lives would be if we didn’t have this creative capacity for slipping out of the midst of reality into soft ‘what ifs’!” (Hofstadter, 1979).
Counterfactual thinking is prevalent in domains of ordinary personal life such as career and romance, after traumatic life experiences such as bereavement, and in public life as observed during public inquiries and court cases.
Studies have found that counterfactual thinking is involved in a variety of psychological processes, including attributions of blame and responsibility, perceptions of fairness, and feelings of guilt and shame.
The counterfactual thoughts of offenders, defendants, and prisoners are likely to center on issues of blame and fairness and on feelings of guilt and shame, much as victims, criminal justice agents, the media, and the public focus on these same issues when considering crime, justice, and punishment.
Social psychologists predicted that prisoners who engaged in counterfactual thinking about how they might have prevented the events leading to their imprisonment would assign more blame to themselves than prisoners who thought only about how they actually brought those events about. The researchers assumed that counterfactual thinking can identify a broader range of blame-relevant factors than a factual analysis of causes (Davis et al., 1996).
One recent study on counterfactual thinking is directly relevant to students because it involves test-taking strategies (Krueger, Wirtz, & Miller, 2005).
When taking multiple choice tests, many students initially think that one of the answers is correct, and they choose it. After thinking about it more, however, they begin to doubt their so-called first instinct and think that another answer is even better.
Are students better off going with their first answer, or should they switch their answer? About 75% of students think it is better to stick with their initial answer.
Some test preparation guides also give the same advice: “Exercise great caution if you decide to change your answer. Experience indicates that many students who change answers change to the wrong answer” (Kaplan, 1999, p. 3.7). However, virtually all studies show that students are better off switching answers.
Krueger and his colleagues have dubbed this tendency the first instinct fallacy, defined as the false belief that it is better not to change one’s first answer even if one starts to think a different answer is correct.
So why do many students, professors, and test guide writers succumb to this fallacy? Research on counterfactual thinking can shed more light on this issue.
Suppose you did get the answer wrong in the end and therefore engaged in counterfactual thinking about what you might have done to get it right.
You’d probably feel the most regret if you had first written down the correct answer and then changed it to a wrong one. You’d feel less regret if you had first written the wrong answer and then refused to change it, because in that scenario you had never put down the right answer. Having first written the correct answer and then erased it makes you feel that you were so close to getting it correct that changing was a terrible mistake.
Counterfactual thinking can envision outcomes that were either better or worse than what actually happened. Thus, counterfactual thoughts fall into two types: upward counterfactuals, which imagine alternatives that are better than actuality, and downward counterfactuals, which imagine alternatives that are worse than actuality. Both upward and downward counterfactuals are discussed at length in designated entries.
People make far more upward than downward counterfactuals, which is probably a good thing because it causes people to consider how to make things better in the future (Roese & Olson, 1997). For example, if Eduardo looks back on his exam and regrets not studying harder so he could have earned a higher grade, he will probably study harder next time.
Downward counterfactuals have their uses too. They particularly help people feel better in the aftermath of misfortune. When something bad happens, people say, “It could have been worse,” and contemplating those even more terrible counterfactuals is comforting.
Counterfactual thinking is also closely related to regret, but the two concepts are not the same thing (Gilovich & Medvec, 1995). One important difference is that regrets are feelings, whereas counterfactuals are thoughts. Regret involves feeling sorry for misfortunes, limitations, losses, transgressions, shortcomings, or mistakes (Landman, 1993).
Ultimately, counterfactual thinking is probably one of the crucial traits that has helped people create and sustain the marvels of human society and culture. Most animals can barely perceive and understand the world as it is, but we can dream of how it can be different. Democracy, women’s liberation, and wireless technology did not exist in nature, but human beings were able to look at life as it was and imagine how it could be different, and these imaginings helped them change the world for the better.