Cognitive Biases

I think therefore I am.

Are you sure? You may think you know what you’re thinking, and
why you’re thinking it. Chances are you’re wrong. You are not what you think
either. You may not even be the one doing the thinking.

More and more, we’re discovering that thought is not just an exercise in
rational, logical deduction. It’s not that we’re not smart, or smart enough.
Intelligence, education, conscientiousness – they’re not decisive. We all
have our biases. Life often comes down to guess-work. We like to simplify
what’s before us.

We base our reactions on heuristics – short-cuts, rules-of-thumb,
reckonings. When we’re faced with problems that don’t offer an obvious and
immediate solution, these habits of mind take over.

We’re for the most part unconscious of these cognitive biases. However,
just knowing that we have biases can guide us to be more fair, more honest
and more insightful.

Cognitive biases were popularized by the Nobel Prize-winning psychologist
Daniel Kahneman and his research partner Amos Tversky. More recently, the
economist Richard Thaler won a Nobel for his work in behavioural economics.
Books such as Dan Ariely’s ‘Predictably Irrational’, Michael Lewis’s
‘Moneyball’ and Malcolm Gladwell’s ‘Blink’ drew attention to the area.

My favourite is Zero-Sum Bias.

‘Your gain comes at my expense!!’

Its baleful effects may be witnessed worldwide.

Examples of Cognitive Bias




[Cognitive Bias poster, Type A and Type B – 200+ biases. Click to enlarge; hi-res 24×36 inch images, approx. 6 MB each.]

The Cognitive Bias poster visually organizes 220+ cognitive biases. The list is based on Buster Benson’s blog ‘Better Humans’. It also builds on John Manoogian’s poster codex, which cites 180 biases.

As Buster suggests, we can organize these biases by looking at why we have them in the first place. He sees our biases as adaptive responses to four basic problems we face in life:

  • We often have information overload and we’re unsure what to focus on
  • What we meet with often doesn’t make immediate sense
  • Sometimes we’re under pressure to act fast so an opportunity does not escape us
  • It’s not always obvious what we need to keep in mind or to remember for later

We can sort cognitive biases into three broad types: decision-making and belief biases, memory errors, and social attribution biases.

Decision-Making and Belief Biases

These biases strongly affect how we form beliefs. They shape our estimates of probability and our take on human behaviour generally.

Example: Confirmation Bias

We tend to seek out information in a way that confirms our preconceptions.

I don’t read that newspaper – it’s biased.

Heard of filter bubbles? Social media have amplified these to monstrous proportions.

Memory Errors

A memory is not a mental photograph: it is a story we tell. We alter memories to suit our needs. We also have memories that we don’t consciously recall.

Example: Frequency Illusion

Right after we have learned or noticed something new, we start seeing it everywhere.

Violent crime is on the increase.

The data say otherwise. Divisive politicians repeatedly cite one extreme crime, and its exaggerated effect twists our perceptions.

Social Attribution Biases

We constantly make attributions about our own behaviour, and about other people’s behaviour too. These attributions affect our perceptions, which in turn lead to even more biased interpretations.

Example: In-group Bias

We tend to give preferential treatment to those we perceive to be members of our own groups.

I wonder how my job interview will go?

If you have a non-WASP name, maybe you never received the call.

If you like what you see, copies of the poster can be bought here.



Cognitive Bias Examples

Below is a sample of the 200+ biases on the posters.

Attentional bias  A tendency of our perception to be affected by our recurring thoughts.
Attribute substitution  Given a difficult and novel problem, we reach for a more familiar, related problem that we can deal with.
Authority bias  Attributing greater accuracy to the opinion of an authority figure – unrelated to its content – and being more influenced by that opinion.
Automation bias  Excessive dependency on automated systems which can lead to erroneous automated information overriding correct decisions.
Availability heuristic  Overestimating the likelihood of events because of how recent or ‘available’  a memory is, or how unusual or emotionally charged it may be.
Clustering illusion  Overestimating the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).
Confirmation bias  When we search for, interpret, focus on and remember information in ways that confirm our preconceptions.
Congruence bias  The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.
Disposition effect  A tendency to sell an asset that has accumulated in value and resist selling an asset that has declined in value.
Distinction bias  Viewing two options as more dissimilar when evaluating them simultaneously than when we do separately.
Dunning–Kruger effect  A tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.
Duration neglect  The neglect of the duration of an episode in determining its value.
Exposure suspicion  How knowledge of a subject’s disease in a medical study may influence the search for causes.
Extremes aversion  We’re more likely to choose an option if it is the intermediate choice, rather than an extreme one.
Extrinsic incentives bias  Viewing others’ motivations as situational-based while viewing one’s own as dispositional, or intrinsic.
Fading affect bias  A bias in which the emotion associated with unpleasant memories fades more quickly than the emotion associated with positive events.
False consensus effect  The tendency for people to overestimate the degree to which others agree with them.
Functional fixedness  Limits a person to using an object only in the way it is traditionally used.
Fundamental attribution error  A tendency to over-emphasize personality-based explanations for behaviours in others while under-emphasizing  situational influences.
Hawthorne effect  When a researcher overlooks that a subject’s response changes due to awareness of being observed.
Herd instinct  Adopting the opinions and following the behaviours of the majority, to feel safer and to avoid conflict.
Hindsight bias  Viewing past events as being more predictable than they actually were; also called the “I-knew-it-all-along” effect.
Horn effect  When one’s perception of another is unduly influenced by a single negative trait.
Irrational escalation  Justifying increased investment based on prior investment, despite new evidence that the decision may be wrong. Also known as the Sunk Cost Fallacy.
Just-world hypothesis  Believing that the world is fundamentally just, causing us to rationalize that an otherwise inexplicable injustice was deserved by the victim.
Lag effect  Learning is greater when studying is spread out over time than when the same amount of studying is massed into a single session (see Spacing effect).
Loss aversion  The disutility of giving up an object is greater than the utility associated with acquiring it. (see  Sunk cost effects and Endowment effect)
Magic number 7 ± 2 bias  The maximum number of chunks of information a person can hold in working memory at the same time (Miller’s Law).
Misinformation effect  Memory becoming less accurate because of interference from post-event information.
Mere exposure effect  Expressing undue liking for things merely because of familiarity with them.
Modality effect  Recall is higher for the last items of a list when the list items were spoken than when they were read.
Neglect of prior base rate  Failing to incorporate prior known probabilities which are pertinent to the decision at hand.
Neglect of probability  The tendency to completely disregard probability when making a decision under uncertainty.
Next-in-line effect  When people in a group take turns speaking, a person has diminished recall for the words of those who spoke immediately before him or her.
Normalcy bias  The refusal to plan for, or react to, a disaster which has never happened before.
Projection bias  Overestimating how much our future selves will share our current preferences, thoughts and values, leading to sub-optimal choices.
Pseudocertainty effect  A tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
Reactance  An urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to limit your freedom (Reverse psychology).
Semmelweis reflex  Rejecting new evidence that contradicts a paradigm.
Serial position effect  When one recalls the first and last items in a series best, and the middle items worst.
Social desirability bias  We over-report socially desirable characteristics or behaviours about ourselves, and under-report the undesirable.
Spacing effect  That information is better recalled if exposure to it is repeated over a long span of time rather than a short one.
Trait ascription bias  Viewing oneself as relatively variable in terms of personality, behaviour, and mood while viewing others as much more predictable.
Travis syndrome  Regarding the present as necessarily more significant or developed than the past.
Verbatim effect  The “gist” of what someone has said is better remembered than verbatim wording. This is because memories are not copies, but reconstructions.
Von Restorff effect  That an item that sticks out is more likely to be remembered than other items.
Weber–Fechner law  Difficulty in comparing small differences in large quantities.
Well travelled road effect  Underestimating the time taken to traverse oft-traveled routes and overestimating time taken to traverse less familiar routes.
Woozle effect  When frequent citation of previous publications that actually lacked evidence misleads us into believing there is evidence to support a belief.
Worse-than-average effect  A tendency to believe ourselves to be worse than others at tasks which are difficult.
Zero-sum bias  A bias whereby a situation is incorrectly perceived to be like a zero-sum game (i.e., one person gains at the expense of another).
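Several of the entries above – neglect of prior base rate, neglect of probability, the availability heuristic – are at bottom failures of probabilistic reasoning, and the arithmetic behind them can be made concrete. Here is a minimal Python sketch of base rate neglect using Bayes’ theorem; the disease prevalence and test accuracy figures are hypothetical numbers chosen for illustration, not from the poster.

```python
# Base rate neglect, made concrete with Bayes' theorem.
# Hypothetical setup: a disease affects 1 in 1,000 people; a test
# detects 99% of true cases but also flags 5% of healthy people.

def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' theorem."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

p = posterior(prior=0.001, sensitivity=0.99, false_positive_rate=0.05)
print(f"P(disease | positive) = {p:.3f}")  # prints 0.019
```

Most people, neglecting the prior, guess the answer is near 99%; the correct posterior is about 2%, because the rarity of the disease swamps the accuracy of the test.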
