Cognitive Biases

I think therefore I am.


Are you sure?

Yogi Berra was on to something: ‘In theory there is no difference between theory and practice. In practice there is.’

You may think you know what you’re thinking, and why you’re thinking it. Chances are you’re wrong. You are not just what you think either. You may not even be the one doing the thinking. We inherit and absorb and mimic: our evolutionary history has made these tendencies central to our beings.

Our thinking is not a series of exercises in rational, logical deduction. It’s not that we’re not smart enough, or not educated enough, to conduct our lives that way. Intelligence, education, conscientiousness – they’re not decisive. Life comes down to guesswork.

It’s not always a choice

We don’t just like or prefer to simplify what’s before us – we need to. That’s because we work with limited resources: time, information, energy, education, aptitude, physical powers. Theorists call these limitations ‘bounded rationality.’

Our reactions and perceptions rely on heuristics – short-cuts, rules of thumb, reckonings. We’re for the most part unconscious of our cognitive biases. Unscrupulous agents continually tailor environments to induce us to rely on biases that favour their interests.

Likewise, systems run down toward equilibrium, toward entropy. Our minds are no different. Convenience, ease, laziness – they all exert a powerful pull on us. The pressures to be compliant and agreeable are such that we smother aspects of ourselves in order to get along socially.

Adulthood can be described as a process of automating our minds to accommodate social pressures and of policing ourselves into compliance. So subtle and so forceful are these pressures that we’re not aware we have internalized them – it feels normal, even highly desirable. It often takes a crisis to reveal to ourselves what we think, and why.

“If all you have is hammers, all you’ll see is nails.”

Habit inures us to acting and reacting in patterns. Our neuroplastic brains reinforce these habits until they take over whenever we face problems that don’t look familiar, or that offer no obvious and immediate solution. We’re innately comparative: we immediately apply tried and trusted models to what appear to be new problems.

Our biases and heuristics are not set in stone. Different environments – in time, space and stimuli – alter our guesswork to varying degrees.

We’re all doomed.

It is easy to be cynical, or misanthropic. A recent article in New Scientist (‘End of Days…?’) suggests we may well be at such a moment: the ‘non-thinking’ mindsets of apparently modern societies may be leading us to ruin.

‘…Cognitive scientists recognise two broad modes of thought – a fast, automatic, relatively inflexible mode, and a slower, more analytical, flexible one. Each has its uses, depending on the context, and their relative frequency in a population has long been assumed to be stable.

David Rand, a psychologist at Yale University, though, argues that populations might actually cycle between the two over time. Say a society has a transportation problem. A small group of individuals thinks analytically and invents the car. The problem is solved, not only for them but for millions of others besides, and because a far larger number of people have been relieved of thinking analytically – at least in this one domain – there is a shift in the population towards automatic thinking. This happens every time a new technology is invented that renders the environment more hospitable. Once large numbers of people use the technology without foresight, problems start to stack up. Climate change resulting from the excess use of fossil fuels is just one example.

Others include overuse of antibiotics leading to microbial resistance, and failing to save for retirement. Jonathan Cohen, a psychologist at Princeton University who developed the theory with Rand, says it could help solve a long-standing puzzle regarding societies heading for ruin: why did they keep up their self-destructive behaviour even though people must have seen the danger ahead? “The train had left the station,” says Cohen, and the forward-thinking folk were not steering it. This is the first time anyone has attempted to link the evolution of societies with human psychology, and the researchers admit their model is simple, for now…’
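
To make the cycling idea concrete, here is a toy simulation – my own sketch, not Rand and Cohen’s actual model, and every parameter in it is invented for illustration. Analytical thinkers slowly clear the stock of environmental problems; a hospitable environment lets the population drift toward automatic thinking; neglected problems then stack up until the growing hazard pushes thinking back toward the analytical mode.

    # Toy sketch of the analytic/automatic cycling idea quoted above.
    # NOT Rand and Cohen's model: all parameters are invented for illustration.

    analytic = 0.5   # fraction of the population thinking analytically
    hazard = 0.0     # stock of unsolved environmental problems

    for generation in range(200):
        # Analytical thinkers solve problems (invent the 'car'),
        # making the environment more hospitable.
        hazard = max(0.0, hazard - 0.3 * analytic)
        # A comfortable environment relieves people of analytical effort,
        # so thinking drifts automatic; a hazardous one pushes it back.
        analytic += 0.05 * (hazard - 0.5)
        analytic = min(1.0, max(0.05, analytic))
        # Problems stack up while most people run on autopilot.
        hazard += 0.2 * (1.0 - analytic)
        if generation % 20 == 0:
            print(f"gen {generation:3d}  analytic={analytic:.2f}  hazard={hazard:.2f}")

Run it and the two quantities chase each other in slow waves: analytical thinking collapses once the environment is comfortable, and recovers only after problems have piled up – the train leaving the station, in miniature.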

We’re not all doomed.

It is easy to be pessimistic, even cynical and misanthropic. But are we merely gormless stimulus-response organisms, perpetually on the lookout for the easy way out – less cognitive effort, less energy expended? No. We have, and have always had, guardians of our minds to call upon.

Meta-cognition: we can think about what we think and how we think it, and thereby alter both. Merely knowing that we have biases can guide us to be fairer, more honest and more insightful.

Neuroplasticity: we learn and adapt and innovate, from birth to death. Conditioned responses and inertia can be overcome.

Contrariness: we have an innate what-if mind that refuses to accept the assumptions around us. Analytical, rational thought needs to be explicitly modelled and taught. Calling ourselves out for our own cognitive biases is a good place to start. The truth may hurt – the alternative will hurt more.

One bias to rule them all?

Of all biases, I’d put Zero-Sum Bias at the top.

“Your gain necessarily means my loss.” 

Its baleful effects can be witnessed worldwide in all ages.

New Scientist maintains an index of its cognitive bias articles.


Cognitive biases were popularized by the Nobel prizewinner Daniel Kahneman and his research partner Amos Tversky. More recently, the economist Richard Thaler won a Nobel for his work in behavioural economics. Books such as Dan Ariely’s ‘Predictably Irrational’, Michael Lewis’s ‘Moneyball’ and Malcolm Gladwell’s ‘Blink’ drew wider attention to the area.



Three Types of Cognitive Bias 


Decision-Making and Belief Biases

These biases strongly affect how we form beliefs. They shape our estimates of probability and our take on human behaviour generally.

Example: Confirmation Bias.

We tend to seek out information in a way that confirms our preconceptions.

‘I don’t read that newspaper – it’s biased.’

Heard of filter bubbles? Social media have amplified these to monstrous proportions.

Memory Errors 

A memory is not a mental photograph: it’s a story we tell. We alter that story to suit our needs too. We also have memories that we don’t consciously recall.

Example: Frequency illusion

Right after we learn or notice something, we start seeing it everywhere.

‘Violent crime is on the increase.’

Data say otherwise. Divisive politicians repeatedly cite one extreme crime so that its exaggerated effect twists our perceptions.

Social Attribution Bias

We constantly make attributions about our own behaviour, and other people’s behaviour too. These attributions affect our perceptions, which in turn lead to even more biased interpretations.

Example: In-group Bias

We tend to give preferential treatment to those we perceive to be members of our own groups.

‘I wonder how my job interview will go?’

If you are a person with a non-WASP name, maybe you never received the call.


Examples of Cognitive Bias 


Below are samples of the 220 cognitive biases on the poster.

Attentional bias  A tendency of our perception to be affected by our recurring thoughts.
Attribute substitution  Given a difficult and novel problem, we reach for a more familiar, related problem that we can deal with.
Authority bias  Attributing greater accuracy to the opinion of an authority figure – unrelated to its content – and being more influenced by that opinion.
Automation bias  Excessive dependence on automated systems, which can lead to erroneous automated information overriding correct decisions.
Availability heuristic  Overestimating the likelihood of events because of how recent or ‘available’  a memory is, or how unusual or emotionally charged it may be.
Clustering illusion  Overestimating the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).
Confirmation bias  When we search for, interpret, focus on and remember information in ways that confirm our preconceptions.
Congruence bias  The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.
Disposition effect  A tendency to sell an asset that has risen in value and resist selling an asset that has declined in value.
Distinction bias  Viewing two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.
Dunning–Kruger effect  A tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.
Duration neglect  The neglect of the duration of an episode in determining its value.
Exposure suspicion  How knowledge of a subject’s disease in a medical study may influence the search for its causes.
Extremes aversion  We’re more likely to choose an option if it is the intermediate choice, rather than an extreme one.
Extrinsic incentives bias  Viewing others’ motivations as situational while viewing one’s own as dispositional, or intrinsic.
Fading affect bias  A bias in which the emotion associated with unpleasant memories fades more quickly than the emotion associated with positive events.
False consensus effect  The tendency for people to overestimate the degree to which others agree with them.
Functional fixedness  Limits a person to using an object only in the way it is traditionally used.
Fundamental attribution error  A tendency to over-emphasize personality-based explanations for behaviours in others while under-emphasizing  situational influences.
Hawthorne effect  When a researcher overlooks that subjects’ responses change because they are aware of being observed.
Herd Instinct  Adopting the opinions and following the behaviours of the majority, to feel safer and to avoid conflict.
Hindsight bias  Viewing past events as being more predictable than they actually were; also called the “I-knew-it-all-along” effect.
Horn effect  When one’s perception of another is unduly influenced by a single negative trait.
Irrational escalation  Justifying increased investment based on prior investment, despite new evidence that the decision may be wrong. aka Sunk Cost Fallacy.
Just-world hypothesis  Believing that the world is fundamentally just, causing us to rationalize otherwise inexplicable injustice as deserved by its victim.
Lag effect  Learning is greater when studying is spread out over time than when the same amount of studying is packed into a single session (see Spacing effect).
Loss aversion  The disutility of giving up an object is greater than the utility associated with acquiring it. (see  Sunk cost effects and Endowment effect)
Magic number 7 ± 2 bias  The maximum number of chunks of information a person can hold in working memory at the same time (Miller’s Law).
Misinformation effect  Memory becoming less accurate because of interference from post-event information.
Mere exposure effect  Expressing undue liking for things merely because of familiarity with them.
Modality effect  Recall is higher for the last items of a list when the list items were spoken than when they were read.
Neglect of prior base rate  Failing to incorporate known prior probabilities pertinent to the decision at hand (a worked example follows this list).
Neglect of probability  The tendency to completely disregard probability when making a decision under uncertainty.
Next-in-line effect  When people take turns speaking, a person has diminished recall for the words of those who spoke immediately before them.
Normalcy bias  The refusal to plan for, or react to, a disaster which has never happened before.
Projection bias  Overestimating how much our future selves will share our current preferences, thoughts and values, leading to sub-optimal choices.
Pseudocertainty effect  A tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
Reactance  An urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to limit your freedom (Reverse psychology).
Semmelweis reflex  Rejecting new evidence that contradicts a paradigm.
Serial position effect  When one recalls the first and last items in a series best, and the middle items worst.
Social desirability bias  We over-report socially desirable characteristics or behaviours about ourselves, and under-report the undesirable.
Spacing effect  That information is better recalled if exposure to it is repeated over a long span of time rather than a short one.
Trait ascription bias  Viewing oneself as relatively variable in terms of personality, behaviour, and mood while viewing others as much more predictable.
Travis Syndrome  Overestimating the significance of the present, as if it were necessarily more developed than the past.
Verbatim effect  The “gist” of what someone has said is better remembered than verbatim wording. This is because memories are not copies, but reconstructions.
Von Restorff effect  That an item that sticks out is more likely to be remembered than other items.
Weber–Fechner law  Difficulty in comparing small differences in large quantities.
Well travelled road effect  Underestimating the time taken to traverse oft-traveled routes and overestimating time taken to traverse less familiar routes.
Woozle effect  When frequent citation of previous publications that actually lacked evidence misleads us into believing there is evidence to support a belief.
Worse-than-average effect  A tendency to believe ourselves to be worse than others at tasks which are difficult.
Zero-sum bias  A bias whereby a situation is incorrectly perceived to be like a zero-sum game (i.e., one person gains at the expense of another).
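
One of these – neglect of prior base rate – can be pinned down with simple arithmetic, as promised in the entry above. The sketch below runs the textbook Bayes’ theorem calculation for a hypothetical disease test; the 1% prevalence and 99% accuracy figures are invented for illustration.

    # Worked example of neglect of prior base rate, via Bayes' theorem.
    # All figures below are hypothetical.

    prevalence = 0.01       # base rate: 1% of people have the disease
    sensitivity = 0.99      # P(test positive | disease)
    false_positive = 0.01   # P(test positive | no disease)

    # Bayes: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
    p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
    p_disease_given_positive = sensitivity * prevalence / p_positive

    print(f"P(disease | positive test) = {p_disease_given_positive:.2%}")  # 50.00%

Intuition, ignoring the base rate, says a positive result from a ‘99% accurate’ test means a 99% chance of disease. The arithmetic says 50%: out of 10,000 people, the 99 true positives are matched by roughly 99 false positives from the 9,900 who are healthy.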


Cognitive Bias poster – 220 biases


5-column version

 

1-column version

(Click to view the PDFs full size in your browser – file sizes are 2 MB and 1 MB for these hi-res images.)

The Cognitive Bias poster visually organizes 220 cognitive biases. The list is based on Buster Benson’s blog ‘Better Humans.’ It also builds on John Manoogian’s poster codex. John, a protean polymath, cites 180 biases in his gorgeous poster.

As Buster suggests, we can organize these biases by looking at why we have them in the first place. He sees our biases as adaptive responses to four basic problems we face in life:

  • We often have information overload and we’re unsure what to focus on
  • What we meet with often doesn’t make immediate sense
  • Sometimes we’re under pressure to act fast so that an opportunity does not escape us
  • It’s not always obvious what we need to keep in mind or to remember for later

 
