Despite its incredible abilities, our brain is often fooled into making seemingly irrational decisions because of certain biases in the way it processes information. Decision making is complex, so we take mental shortcuts based on our emotions, experience or just the way information is framed. We tend to see patterns where there aren't any (clustering illusion), be overly optimistic about our own abilities (overconfidence bias), follow the judgement of others (bandwagon effect) and so on. Scientists regularly remind us of the many ways cognitive biases interfere with the choices we make.
How does cognitive bias affect decision making?
It can cloud our judgement and lead to disastrous choices. Cognitive bias has practical ramifications beyond private life, extending to professional domains including business, military operations, political policy, and medicine. Some of the clearest examples of the effects of bias on consequential decisions involve the influence of confirmation bias on military operations. Confirmation bias (the tendency to conduct a biased search for, and interpretation of, evidence in support of our hypotheses and beliefs) contributed to the downing of Iran Air Flight 655 in 1988 and to the decision to invade Iraq in 2003.
So are we doomed to make terrible decisions?
Ever since Daniel Kahneman and Amos Tversky formalized the concept of cognitive bias in 1972, most empirical evidence has supported the claim that decision-making abilities cannot be improved within the person. However, our latest field study, published in Psychological Science in September 2019, suggests that a single de-biasing training intervention can significantly reduce the deleterious influence of cognitive bias on decision making. We conducted our experiment in a field setting with 290 graduate business students at HEC Paris. In our experiment, one training intervention reduced biased decision making by almost a third.
How much does (or could) this improve decision making?
The results of our paper (led by Professors Anne Laure Sellier of HEC Paris, Irene Scopelliti of City, University of London, and Carey K. Morewedge of Boston University) establish a clear link between cognitive bias reduction training and improved judgment and decision-making abilities in a high-risk managerial context. Our results could have far-reaching consequences for everyday choices, but also for crucial, high-stakes decisions. At a military level, such training could help avoid some of the deadly errors the US Armed Forces committed in the past. As American educator Ben Yagoda pointed out in his compelling article in The Atlantic last year, without confirmation bias, the US may not have concluded that Iraq possessed weapons of mass destruction and decided to invade in 2003. As the official 2005 report to George W. Bush put it: “The disciplined use of alternative hypotheses could have helped counter the natural cognitive tendency to force new information into existing paradigms.”
Which particular biases can be attenuated and how?
Our research focuses on one particular training intervention, which had produced large and long-lasting reductions in confirmation bias, correspondence bias, and the bias blind spot in the laboratory. The intervention was originally created for the Office of the Director of National Intelligence and was designed to reduce bias among US government intelligence analysts.
The intervention involved playing a serious game that gives players personalized feedback and coaching on their susceptibility to cognitive biases. The training elicited biases from players during game play, and then defined each bias. It gave examples of how each bias influenced decision making in professional contexts (e.g., intelligence and medicine), explained to participants how their choices may have been influenced by the biases, and provided participants with strategies to avoid bias and practice opportunities to apply their learning to new problems.
How exactly did you train the participants in your study?
Before or after they played the serious game, students from three Master’s programs at HEC Paris were asked to crack Carter Racing, a complex business case modelled on the fatal decision to launch the Space Shuttle Challenger, which broke apart shortly after launch in 1986. Each participant acted as the lead of an automotive racing team making a high-stakes, go/no-go decision: remain in a race or withdraw from it. The case is designed so that its surface features suggest the team should race, but careful analysis of the case evidence reveals that racing will have catastrophic consequences for the team. We measured whether the cognitive bias reduction training improved decision making in the case. Would trained participants decide to race, or not? Crucially, trainees were not aware that their decision making would be examined for bias.
Can such training truly improve judgement?
The results were promising. Participants trained before completing the case were 29% less likely to choose the inferior hypothesis-confirming solution (i.e., to race) than participants trained after completing the case. This result held when we controlled for individual differences including gender, work experience, GMAT scores, GPA, and even participants’ propensity for cognitive reflection (i.e., their tendency to override an incorrect “gut” response and engage in further reflection leading to a correct answer). Our analyses of participants’ justifications for their decisions suggest that their improved decision making was driven by a reduction in confirmatory hypothesis testing. Trained participants generated fewer arguments in support of racing—the inferior case solution—than did untrained participants.
Our results provide encouraging evidence that training can improve decision making in the field, generalizing to consequential decisions in professional and personal life. Trained participants were more likely to choose the optimal case solution, so training improved rather than impaired decision making.
How applicable are your (lab-tested) results in the wider world?
Of course, our findings are limited to a single field experiment. More research is needed to replicate the effect in other domains and to explain why this game-based training intervention transferred more effectively than other forms of training tested in past research. One possibility is that games are more engaging than lectures or written summaries of research findings. Another is that the game provided intensive practice and personalized feedback. A third is the way the intervention taught players about biases: training may be more effective when it describes cognitive biases and how to mitigate them at an abstract level, and then gives trainees immediate practice applying their new knowledge to different problems and contexts.
People have debated how to overcome the many ways in which we deviate from rationality since well before the concept of cognitive bias was first coined over six decades ago. The general conclusion has been that decision making cannot be improved within persons, and that the only way to reduce bias is through changes to the environment, such as nudges. In September 2018, Nobel laureate Daniel Kahneman said, “You can’t improve intuition. Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning… Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument, the rules go out the window.”
We believe our results show, fortunately, that this conclusion may be premature. Training appears to be a scalable and effective intervention that can improve decisions in professional and personal life.