
Summary on cognitive bias

Topic: Cognitive bias
by Rishi, 2018 Cohort

Note: This entry was created in 2018, when the task was to “summarise a key reading”, and so may not represent a good example to model current primer entries on.

Cognitive Bias#

Cognitive bias occurs when people adopt certain views despite a lack of evidence or logic to justify them (Haselton et al., The Evolution of Cognitive Bias). Understanding cognitive bias also requires an understanding of heuristics: mental shortcuts that help us make judgements or decisions efficiently, often at the expense of accuracy. How we apply these heuristics depends on our beliefs, values and past experiences, and these subjective variables are what ultimately inform our cognitive biases. Often, these biases and our reliance on heuristic shortcuts lead to systematic errors of judgement.

Applying the Representativeness heuristic#

We as humans often make assumptions based on how representative A is in relation to B. For example, when we stereotype, we make predictions such as: if person A wears a pinstripe suit and carries a leather briefcase, then person A belongs to category B (lawyer). Categories such as “Lawyer” are built from our past experience of noticing that, generally, lawyers wear pinstripe suits and carry leather briefcases. The danger of this blinkered method of thinking is illustrated by Tversky and Kahneman’s research (Judgment under Uncertainty: Heuristics and Biases), in which subjects were given personality descriptions of 100 professionals, each either an engineer or a lawyer, and asked to categorise each description. One group was told that the 100 people comprised 70 lawyers and 30 engineers; another group was told the opposite. Despite receiving this information, subjects ignored the prior probabilities, and both groups produced very similar results. Relying purely on intuition while disregarding concrete information such as prior probability, then, may not always be the correct approach. It could be suggested that relying on intuition is analogous to taking a non-systems-thinking approach to solving problems: you assume that the first solution you can think of is the best and most practical one before considering unintended consequences, or, in the case of the representativeness heuristic, before considering prior probabilities. However, it is also important to note that even systems thinkers must actively notice and be wary of their own intuitions and biases when tackling complex problems.
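To see what ignoring the prior probabilities actually costs, we can sketch the normative calculation the subjects skipped. The following is a minimal illustration (not part of the original study): it applies Bayes’ rule to a description that “sounds lawyer-like”, using made-up likelihoods of 0.9 for lawyers and 0.3 for engineers, and shows that the two groups’ stated base rates should have produced noticeably different answers.

```python
def posterior_lawyer(p_lawyer_prior,
                     p_desc_given_lawyer=0.9,
                     p_desc_given_engineer=0.3):
    """P(lawyer | description) via Bayes' rule.

    The likelihoods are invented for illustration: they encode a
    description that merely *sounds* lawyer-like.
    """
    p_engineer_prior = 1 - p_lawyer_prior
    numerator = p_desc_given_lawyer * p_lawyer_prior
    denominator = numerator + p_desc_given_engineer * p_engineer_prior
    return numerator / denominator

# Group told 70 lawyers / 30 engineers:
print(round(posterior_lawyer(0.70), 2))  # 0.88
# Group told 30 lawyers / 70 engineers:
print(round(posterior_lawyer(0.30), 2))  # 0.56
```

Even with the same description, the correct confidence differs substantially between the two groups; the near-identical answers Tversky and Kahneman observed show that subjects relied on representativeness alone and let the base rates drop out entirely.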

Biases and Validity of Information#

When we make predictions or assumptions based on information given to us, we often do not question its accuracy and apply it regardless, due to our pre-existing biases. For example, if you are told there will be a full moon tonight, you may accept it as true simply because you haven’t seen a full moon this month so far (your personal bias), and in doing so fail to question the source of that information and whether it is accurate. This bias often persists even when someone is aware of the poor quality of the information provided, and we are often particularly confident in a prediction when it is representative of the input (Tversky & Kahneman). Applying this to complexity, we can learn from this way of thinking by being reluctant to accept something as fact just because it aligns with what we know, or think we know, already. I believe this also links to a point discussed in class: with information being so freely accessible, perhaps we need to be more discerning when we receive information or knowledge, especially when applying it to complex problems.

Parting questions#

  • Are biases necessarily a problem?
  • What other ways could we link the concept of cognitive bias to dealing with complexity?
  • What is the importance of being aware of your own/others’ biases?

Explore this topic further#

Return to Cognitive bias in the Primer

Disclaimer#

This content has been contributed by a student as part of a learning activity.
If there are inaccuracies, or opportunities for significant improvement on this topic, feedback is welcome on how to improve the resource.
You can improve articles on this topic as a student in "Unravelling Complexity", or by including the amendments in an email to: Chris.Browne@anu.edu.au
