Cognitive Bias: 13 Reasons Your Risk Radar Might Have Gone Awry
While we like to think of ourselves as rational beings – and economic theory assumes in its models that we are – the reality of human decision making is quite different. That is not to say that humans are generally stupid or irrational. Rather, there are different “modes” of decision making, which Nobel Prize-winning behavioral economist Daniel Kahneman [1] calls “fast” and “slow” thinking. Fast, or System 1, thinking is intuitive and heuristics-based, whereas slow, System 2, thinking is the rational kind. We do most of our decision making in System 1, and it’s good that we do: slow thinking is too, well, slow for normal, everyday use. It also consumes far more resources: time, energy, and information. So not deciding everything rationally is a good thing.
The success of the heuristic approach to decision-making depends on the correct application of lived experience, and it can appear quite “magical” at times. Think of a fireman with a “sixth sense” for precisely when and where the floor of a burning building will collapse – he has heard the warning signs hundreds of times before and, without conscious thought, puts together the pattern of destruction. Or an experienced GP [2] who can tell with a few glances (and sometimes sniffs) at her patients what’s wrong with them.
Issues with the intuitive approach can arise in several ways. For one, it is inherently practical, i.e. tied to things that can be experienced directly, not to the abstract concepts and complex issues that often arise in modern decision making. Secondly, it is prone to systematic errors in judgement: biases that affect our thinking and lead our built-in heuristics astray. These cognitive biases can also affect our slow thinking, but only if we let them.[3] Being aware that we have these biases, and knowing what they are, allows us to improve our (slow) decision making.
Psychologists have catalogued more than 100 more or less distinct biases that affect our thinking and decision-making. Today, I’ll just list the top thirteen that affect our risk taking and risk decisions. Please be aware that the list itself, and its order, is subjective. Anybody who disagrees may do so in the comments.
Confirmation bias. Arguably the most common (and dangerous) bias, confirmation bias causes us to seek out facts that confirm our views and ignore those that contradict them. It’s incredibly hard to shake confirmation bias, and even trained scientists fall prey to it when the data they collect contradicts their favorite theory. As it causes us to – purposefully or subconsciously – leave out a subset of data, decision-making is automatically suboptimal. Echo chambers couldn’t exist without it. Confirmation bias can be seen as a protective mechanism against cognitive dissonance.
Survivorship bias. This is a type of sampling error that only looks at data from successful ventures (the survivors) instead of taking both successes and failures into account and analyzing the factors behind each. History is full of examples of this bias in action; one of the more famous is from WW2, when Allied command examined where returning planes had been hit by enemy fire and proposed strengthening those areas. The statistician Abraham Wald pointed out the flaw: the planes hit elsewhere never made it back, so it was the undamaged spots on the returning planes that needed the extra armor.
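To make the sampling error concrete, here is a minimal Python sketch of the bomber example, with entirely made-up hit and survival numbers. Only the planes that return can be inspected, so the area that is most dangerous to be hit in (here, the engine) shows the fewest hits in the surviving sample.

```python
import random

random.seed(42)

AREAS = ["wings", "fuselage", "tail", "engine"]
N_PLANES = 10_000

# Hypothetical survival probability given a hit in each area:
# hits to the engine are far more likely to bring a plane down.
SURVIVAL_GIVEN_HIT = {"wings": 0.9, "fuselage": 0.85, "tail": 0.8, "engine": 0.3}

hits_on_returned = {a: 0 for a in AREAS}
hits_on_all = {a: 0 for a in AREAS}

for _ in range(N_PLANES):
    area = random.choice(AREAS)            # each plane takes one hit, uniformly at random
    hits_on_all[area] += 1
    if random.random() < SURVIVAL_GIVEN_HIT[area]:
        hits_on_returned[area] += 1        # only surviving planes can be inspected

print("Hits visible on returning planes (the biased sample):")
for a in AREAS:
    print(f"  {a:8s} {hits_on_returned[a]:5d}")

print("Hits across all planes, including those that never came back:")
for a in AREAS:
    print(f"  {a:8s} {hits_on_all[a]:5d}")
```

In the biased sample the engine looks like the area that is hit least, even though every area was hit equally often; concluding from the survivors alone is exactly the trap.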
Agent detection. In an earlier post about COVID-19, I wrote about Type I errors and how they might have evolved as a persistent trait. Type I errors are not a cognitive bias on their own; rather, they arise from a combination of biases – specifically the previous two plus agent detection and its two cousins, anthropocentric thinking and anthropomorphism. These describe our tendency to think about everything in human terms – ascribing rational thought to animals, or seeing patterns and intent behind even the most random occurrences. Understanding when noise is just random and when there is a real pattern is important for correctly assessing risks and rewards in decision-making.
Anchoring. This is the tendency to rely on one piece of information (usually the first one acquired) above all others when making decisions under uncertainty. Marketing relies on this cognitive bias pretty heavily. The closely related priming effect can go so far that test subjects walk more slowly after a test in which they were “fed” words associated with old age or illness.
Default effect. When faced with a set of decision options, we will often choose to do nothing, i.e. stick with the default. For instance, countries that make organ donation consent the default, from which the individual has to opt out, have far more donors than countries where individuals expressly have to opt in. The same logic applies to subscription models with automatic vs. explicit renewal.
Loss aversion. In utility terms, most people perceive the loss of an object as worse than the earlier gain of receiving it was good. This is related to other effects like the endowment effect (people demand more to give up something than they are willing to pay to acquire it) and the sunk cost effect (e.g. persisting in a project that has no prospective future value). In rational decision making, only future risks and rewards are relevant; everything else is “water under the bridge”.
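As a rough illustration (not something from this post itself), loss aversion is often written down with the prospect-theory value function from Kahneman and Tversky; the parameter values below are the commonly cited estimates from their 1992 paper, so treat the exact numbers as indicative only.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: gains are valued as x**alpha,
    losses are scaled by the loss-aversion factor lam."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

gain = prospect_value(100)    # subjective value of gaining $100
loss = prospect_value(-100)   # subjective value of losing $100

print(f"value of +$100: {gain:7.1f}")
print(f"value of -$100: {loss:7.1f}")  # roughly 2.25 times larger in magnitude
```

The point is simply that, for most people, losing $100 “weighs” about twice as much as gaining $100 feels good.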
Dunning-Kruger effect. Ever read about the survey in which 80% of drivers thought they were better than average? That’s a result of the Dunning-Kruger effect: less skilled people tend to overestimate their abilities relative to their peers, whereas highly skilled people tend to do the opposite. Its cousin is the hard-easy effect: we tend to overestimate our ability to master hard problems and underestimate it on easy ones. In high-risk situations, the two can be a potentially deadly combination.
Gambler’s fallacy. The erroneous belief that past events influence future events in random settings: “The roulette wheel came up red five times in a row, so black is now due.” What makes this one hard to recognize in oneself is that sometimes past events, i.e. priors, do matter for future risk calculations and decisions. Related is the “hot-hand fallacy”: the belief that somebody who just had success in a random event is more likely to succeed again. Both will mess up your risk radar.
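A quick Monte Carlo sketch (assuming a fair European wheel, which the quote above doesn’t spell out) shows why the “black is due” reasoning fails: the probability of black right after five reds in a row is the same as the unconditional probability, up to sampling noise.

```python
import random

random.seed(1)

# European wheel: 18 red, 18 black, 1 green zero
POCKETS = ["red"] * 18 + ["black"] * 18 + ["green"]

N_SPINS = 2_000_000
spins = [random.choice(POCKETS) for _ in range(N_SPINS)]

# Collect the outcome of every spin that follows a streak of five (or more) reds
after_streak = []
streak = 0
for outcome in spins:
    if streak >= 5:
        after_streak.append(outcome)
    streak = streak + 1 if outcome == "red" else 0

p_black_overall = spins.count("black") / N_SPINS
p_black_after = after_streak.count("black") / len(after_streak)

print(f"P(black), all spins:              {p_black_overall:.3f}")
print(f"P(black) after 5 reds in a row:   {p_black_after:.3f}")  # essentially identical
```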
Egocentric bias. There are several biases in this group, such as the illusion of control (overestimating one’s influence on events), the illusion of validity (overestimating the accuracy of one’s judgements), or the restraint bias (overestimating one’s ability to show restraint when faced with future temptation). We all know people who behave like that, but we ourselves would never fall prey to it, of course we wouldn’t. If you just thought that last sentence, that is called “illusory superiority” due to a “bias blind spot” – all egocentric biases.
Neglect of probability. Making decisions under uncertainty is tough, but it is important to make best estimates of the probabilities of risks and outcomes. This cognitive bias describes the tendency to skip that step, i.e. to ignore probability altogether and react only to the size of the outcome.
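Here is a toy example of the estimate this bias makes us skip, using hypothetical risks and made-up numbers: ranking by impact alone puts the rare catastrophic event first, while weighting each impact by its probability (the expected loss) can reorder the list completely.

```python
# Hypothetical risks with rough annual probability and impact estimates (made-up numbers).
risks = [
    # (name, annual probability, impact in $)
    ("data-center fire",        0.002, 5_000_000),
    ("laptop theft",            0.30,      3_000),
    ("key supplier goes under", 0.05,    400_000),
]

# Rank by expected loss = probability * impact instead of by impact alone.
for name, p, impact in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
    print(f"{name:25s} impact ${impact:>9,}   expected loss ${p * impact:>9,.0f}")
```

Judged by impact alone, the fire dominates; judged by expected loss, the supplier risk comes out on top. Neither view is sufficient by itself, but ignoring probability entirely guarantees a skewed risk radar.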
Normalcy bias. This bias is, or can be, a result of cognitive dissonance. A disaster or other bad outcome that has never happened to us before is ignored, or we refuse to plan for it. COVID-19 denial comes to mind here, although, as discussed, that can have other causes, too.
Projection bias. We don’t really understand how we change over time. Projection bias is our tendency to overestimate how similar our future preferences and values will be to today’s, which leads to decisions that aren’t optimal for our future selves.
Zero-risk bias. This bias runs through several of our discussions on this blog, especially those on risk mitigation and residual risk. It is the tendency to prefer the complete elimination of a single (sub-)risk over a much greater reduction in overall risk.
As I said, this is just a subjective selection, and there are many more biases that can negatively influence risk taking. Depending on future topics, we might revisit some of the above or do a deeper dive into one of them. Stay tuned.
Notes:
[1] If there is one book aspiring and experienced risk takers and decision makers should read, Mark and I agree it’s Kahneman’s “Thinking, Fast and Slow”. The link is up in the bibliography.
[2] For the US readers: primary care physician
[3] It can be argued that being aware of our biases will also improve our fast thinking. Probably true, but that will be much harder.