Understanding Heuristics: Mental Shortcuts
In our daily lives we are confronted with decisions so constantly that it is impossible to think through all of the relevant information in a careful and deliberate manner: deliberate, conscious thinking is difficult and requires effort. Because we cannot afford to expend large amounts of time or energy on every judgment the day requires, we rely on so-called heuristics, or simplifying strategies, to make reasonable guesses.
Heuristics—otherwise called rules of thumb—are time-saving mental shortcuts that (almost) everyone uses to speed up judgments. They are quick and easy, yet they are also where things can go wrong, because they often produce biases that skew our judgment.
Usually the increased speed of decision making outweighs the loss in decision quality. However, people do not consciously make this trade-off between decision quality and speed because they are typically unaware that they are using a heuristic instead of more time-consuming, but more accurate, strategies.
The term heuristic is derived from Greek, meaning “serving to find out, or discover.” In his Nobel prize–winning paper, Einstein (1905) used the term heuristic in its title to indicate an idea that he considered incomplete, given the limits of our knowledge, but useful.
From the Gestalt psychologists to Herbert Simon, and from machine learning to decision making (Payne et al., 1993), heuristics have been seen as positive tools that help when uncertainty is high, optimization is out of reach, and deadlines are looming.
The term heuristic refers to the cognitive process that generates a decision; a model of a heuristic describes the steps of this process.
In the simplest case, these include a search rule (what information is searched for in what order, inside or outside of memory), a stopping rule (when search is stopped and further information ignored), and a decision rule (how a decision is derived from the information found).
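These three rules can be written down directly. The sketch below is a minimal, hypothetical lexicographic heuristic in the style of "take-the-best"; the cue names, their ordering, and the city data are all invented for illustration:

```python
def take_the_best(option_a, option_b, cues):
    """Pick one of two options using search, stopping, and decision rules."""
    for cue in cues:                       # search rule: cues in order of validity
        a, b = cue(option_a), cue(option_b)
        if a != b:                         # stopping rule: first discriminating cue
            return option_a if a > b else option_b   # decision rule
    return None                            # no cue discriminates; fall back to a guess

# Hypothetical cues for "which city is larger?", ordered by assumed validity.
cues = [
    lambda c: c["is_capital"],
    lambda c: c["has_airport"],
    lambda c: c["has_university"],
]
city_x = {"name": "X", "is_capital": 0, "has_airport": 1, "has_university": 1}
city_y = {"name": "Y", "is_capital": 0, "has_airport": 0, "has_university": 1}
print(take_the_best(city_x, city_y, cues)["name"])  # prints "X"
```

Note how little information the process touches: the first cue ties, the second discriminates, and the search stops there without ever consulting the third.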
When the term heuristic came into use in the cognitive illusions program around 1970, its meaning was changed in several respects. It was now applied to problems that could be solved by probability theory. In these problems a heuristic could, by definition, only result in bad judgments, and every single demonstration was negative.
In fact, the terms heuristic and bias became almost synonymous and were used interchangeably. For instance, Jolls, Sunstein, and Thaler (1998) list both biases and the availability heuristic under the rubric “judgmental errors.”
Heuristics help people reduce the amount of work needed to collect and process the array of information related to making a decision. They greatly simplify our lives and usually yield fairly accurate judgments, but sometimes they can lead us astray and result in errors.
Although people use several heuristics, cognitive psychologists have identified three that people commonly use in decision making (often concurrently): the representative, availability, and anchoring-and-adjustment heuristics.
The representative heuristic deals with biases in categorizing (for example, random events or probabilities) and can skew our judgment. Psychologist Scott Plous explains the ‘representative heuristic’ in his 1993 book The Psychology of Judgment and Decision Making by using the example of Linda, who is ‘committed to social justice’.
When research participants are asked to decide which is more likely: ‘Linda is a bank teller’ or ‘Linda is a bank teller and an active feminist’, the majority pick the second option. While there are inevitably more bank tellers than there are feminist bank tellers, respondents have picked up on the words ‘social justice’ and ‘feminist’ and made an illogical connection.
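The logic behind the Linda problem is the conjunction rule of probability: a conjunction can never be more probable than either of its parts. A tiny sketch makes the point, with all counts invented purely for illustration:

```python
# Invented counts, for illustration only. Every feminist bank teller is,
# by definition, also a bank teller, so the conjunction cannot be larger.
population = 1_000_000
bank_tellers = 1000
feminist_bank_tellers = 150              # a subset of the 1000 bank tellers

p_teller = bank_tellers / population     # P(Linda is a bank teller)
p_both = feminist_bank_tellers / population  # P(teller AND feminist)
assert p_both <= p_teller                # the conjunction rule always holds
```

Whatever numbers are plugged in, the assertion can never fail, which is exactly why the majority answer in the Linda problem is illogical.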
A further example of the representative heuristic is the ‘gambler’s fallacy’: the belief that past events change the probability of future results. A classic case is the assumption that a run of roulette-wheel reds will continue (or be broken by a black), when in fact the previous results have no influence on the next spin.
Of course, while the confident may assume their luck will continue, the under-confident will use the representative heuristic to support their conviction of poor luck.
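The independence claim is easy to check by simulation. The sketch below assumes an unbiased two-color wheel (ignoring the green zero for simplicity) and estimates the chance of red immediately after five reds in a row; it comes out close to the base rate of 0.5, neither higher nor lower:

```python
import random

random.seed(0)  # reproducible run
# True = red, False = black; the green zero is ignored for simplicity.
spins = [random.random() < 0.5 for _ in range(200_000)]

runs = reds_after = 0
for i in range(5, len(spins)):
    if all(spins[i - 5:i]):   # the previous five spins were all red
        runs += 1
        reds_after += spins[i]

print(reds_after / runs)  # close to 0.5: the streak changes nothing
```

The same check works for the under-confident version of the fallacy: a streak of "bad luck" leaves the next spin just as unchanged.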
The availability heuristic, according to Plous, concerns ‘availability’ – i.e., the information we use to assess the probability of an eventuality. It’s the availability heuristic that keeps people buying lottery tickets because big wins are big news, so they incorrectly assess the likelihood of their own win. And it’s the availability heuristic that induces fear of flying, again because crashes are big news and therefore seem more frequent.
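A quick calculation shows the gap between the vividness of a jackpot and its actual probability. The 6-of-49 format below is our assumption for illustration, not an example from Plous:

```python
import math

# Hypothetical 6-of-49 lottery: pick 6 numbers out of 49, all must match.
combinations = math.comb(49, 6)
print(combinations)  # 13983816 -> roughly 1 chance in 14 million per ticket
```

The winner's face is on the news; the other 13,983,815 tickets are not, which is precisely the asymmetry the availability heuristic exploits.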
Anchoring and Adjustment
The anchoring-and-adjustment heuristic involves making a judgment by starting from some initial point and then adjusting to yield a final decision. The initial point, known as the anchor, can come from the way a problem is framed, from historical factors, or from random information.
Even when an anchor is absurd and people recognize it as such, their subsequent judgments are often very close to that starting point (Dawes, 1988). This means that regardless of the initial anchor point, subsequent adjustments tend to be insufficient, resulting in biased information processing.
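Insufficient adjustment can be captured in a toy model: the final estimate moves from the anchor toward the true value, but only part of the way. The 50% adjustment rate below is an invented illustration, not an empirical estimate:

```python
def anchored_estimate(anchor, true_value, adjustment_rate=0.5):
    """Adjust away from the anchor, but only partially (rate < 1)."""
    return anchor + adjustment_rate * (true_value - anchor)

true_value = 100
print(anchored_estimate(anchor=10, true_value=true_value))   # 55.0, dragged low
print(anchored_estimate(anchor=500, true_value=true_value))  # 300.0, dragged high
```

Two judges facing the same true value but different anchors end up with very different estimates, each pulled toward its own starting point.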
For example, some school systems categorize children into performance categories at an early age. Whereas a child anchored in a low-performance group might merely meet expectations, another child of similar ability but anchored in a higher-performance category could be perceived as a better performer simply because he or she was categorized as a high performer.
These three general heuristics represent ways in which people might simplify the decision-making process. As shown, these simplifications can result in specific types of biases. When we consider cultural variation and the role it plays in social cognition, we can anticipate systematic differences in how these heuristics are applied and the resulting biases. In reality, more than one of the heuristics might be used in any single decision. In addition, many other types of biases result from the use of these three heuristics or rules of thumb.
Developing Critical Thinking
Looking beyond heuristics is therefore an important part of developing strong judgment. Yet the world’s ever-growing complexity makes this increasingly difficult — leaving us more and more hostage to the knee-jerk (and usually negative) assessments that have been the average person’s burden since early childhood. One way round this — at least according to educational psychologists Richard W. Paul and Linda Elder — is to develop the tools for critical thinking.
In Critical Thinking, their landmark 2002 book, Paul and Elder describe critical thinkers (i.e. those with strong judgment) as having ‘intellectual virtues’ that reinforce good decisions. These include humility, courage, empathy, integrity, perseverance, ability to reason, autonomy (i.e. being capable of independent thought) and fair-mindedness.
And while this sounds like a tall order for the average person—blighted as they are by a lifetime of poor self-reinforcing judgments—in reality it’s little more than the application of Dweck’s growth mindset. It’s the journey towards good judgment that matters. And this can be rationalized by exploding judgment into its components.
According to Paul and Elder these are:
- Purpose — What are you seeking to achieve from a judgment?
- Point of view — From what perspective are you currently thinking?
- Assumptions — What assumptions are within your current thinking, and should these be examined?
- Implications — What are the likely consequences of any judgment?
- Information — What information is required and is it at hand?
- Inferences — What can be deduced from the information you already have?
- Concepts — What ‘principles’ or ‘theories’ (or even heuristics) are at play here, and are they worth questioning?
- Questions — Indeed, what should you be asking yourself throughout the entire assessment process, and where will questions have to remain unanswered?
This might look like a lot to ask from anyone trying to improve their judgment. Yet critical thinking is in fact a natural process that, according to Paul and Elder, we develop from experience. By adopting the above rationalization we’re simply making ourselves aware of the process.
Critical thinking when applied to decision-making, say Paul and Elder, ‘enhances the rationality of decisions made by raising the pattern of decision-making to the level of conscious and deliberate choices.’
And if this sounds like a treatise for protracted decision-making, perhaps it should. Good decisions are made slowly — not least because rapid decisions are often fearful and reactive.