Missed last week's London Evening Lecture? Read the review on 'Uncertainty and Cognitive Biases' here…
Review by Tracey Dancy, Dancy Dynamics
“Our ability to problem solve is immense, and our adaptation has led to instinctive, intuitive, experience-based and emotion-based techniques for problem solving and decision-making. Most of the time these methods serve us well and lead to a satisfactory outcome. However, they also lead to irrational behaviours, misguided interpretations and poor decisions, otherwise known as cognitive biases”. Marc Bond, of Rose and Associates, used this premise as the basis for the PESGB London Evening Lecture on 14th July, and went on to describe how our cognitive biases can negatively impact our judgements and decision-making, before offering some solutions to mitigate their influence.
In introducing the lecture, Marc first discussed the terms Risk and Uncertainty – words we use a lot in our industry, often interchangeably. Marc explained the difference: risk is the threat of a chosen action or event leading to a loss or unwanted outcome, while uncertainty is having limited knowledge, expressed as a range of possible outcomes. Humans are very good at understanding risk but poor at understanding uncertainty, particularly in complex situations: evolution favours our ability to understand risk far more than uncertainty, and the problem is exacerbated by the increasingly complex environment in which we now make decisions.
So what impact does this have on us? Marc showed examples of familiar tests that expose the limitations of our minds: the Müller-Lyer illusion – we know the lines are the same length but we cannot see it; the Leeper and Boring psychology experiment; the Monty Hall problem (which turns on the difficulty of applying Bayes' Theorem); and the Stroop test. Marc went on to illustrate how poor we are at estimating, particularly where there is no point of reference. Typically we underestimate numbers and volumes, and this carries across into the exploration world – and while the common stereotype is that explorationists are over-optimistic in their assessments, the results in the development world are the same: production targets fall far short of pre-development predictions.
AllianceBernstein did a similar study and found the same results. The implications are staggering: companies continually “miss” their production, capex and earnings targets. Notably, this poor forecasting has a dramatic impact on global supply predictions and the oil price.
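As an aside, the counter-intuitive Monty Hall result mentioned earlier is easy to verify empirically. The short simulation below is a sketch, not part of the lecture: it plays the game many times and compares the "stay" and "switch" strategies.

```python
import random

def monty_hall(switch, trials=100_000):
    """Simulate the Monty Hall game; return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # contestant's first pick
        # Host opens a door that is neither the pick nor the car
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining unopened door
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {monty_hall(switch=False):.3f}")   # close to 1/3
print(f"switch: {monty_hall(switch=True):.3f}")    # close to 2/3
```

Switching wins about two thirds of the time – the answer most of us, relying on intuition rather than Bayes' Theorem, get wrong.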
When faced with a problem, in whatever situation, our tendency is not to consider alternative models and multiple working hypotheses. In fact, research shows that most of us find it difficult to manage and think through multiple working hypotheses, much as we struggle with multi-tasking. This is our “reflexive” system at work – quick, intuitive, “fight or flight” thinking – which comes more naturally to us and to which we owe much of our evolutionary success. Where critical thinking is required – our “reflective” thought processes – we need much more time and effort to think through a problem.
Marc went on to describe how there are many types of cognitive bias, which affect our thinking in different ways. He concluded that four specific biases – Anchoring, Confirmation, Overconfidence and Representative – have a particular impact on our resource and chance estimations, leading us to poor interpretations and decisions.
Anchoring Bias is the tendency to base an evaluation on an initial reference value or piece of information/data – the “anchor” – and then to be reluctant to move away from it, to either the high or the low side.
For example, when considering whether or not to sell a stock we own that has been a poor performer, we focus more on what we paid for the stock than on its current value. However, the market does not “care” what we originally paid. The Leeper and Boring experiment with the “Old/Young” woman was also a good example of anchoring: different groups were shown pictures of either young or old women before being shown the ambiguous picture, and those who previewed pictures of older women were far more likely to see the “Old Woman” than the “Young Woman”, and vice versa. Marc commented that this bias can be so insidious that even a random, non-relevant value can anchor us (wrongly!). Industry examples of Anchoring would include:
- Resource estimation
- Chance of Success estimation
- Focus on one geologic model or scenario
- Only one seismic interpretation and structure map
- Use of Analogues
- Project Planning and forecasting
- Reliance on expert opinions
Confirmation bias is the tendency to search for, favour or interpret data or information in a way that confirms one’s preconceptions or beliefs. We tend to seek out information or data that confirm the model in front of us, dismissing evidence that does not fit our preconceived idea – even though seeking to falsify a hypothesis is one of the core elements of the scientific process. Industry examples of Confirmation Bias would include:
- Data acquisition
- Data interpretation
- Supporting evidence / models
- Non-supporting evidence / models
Overconfidence bias is the tendency to overestimate the accuracy of one’s own interpretation, judgements or ability. For example, various studies have found that 70-80% of drivers surveyed rated themselves as “above average drivers”. Marc considered this bias to be the most common among us, one which can have a substantial negative impact upon our interpretations and decisions.
We find overconfidence bias in many aspects of our business:
- Data interpretation
- Resource estimation (particularly Minimum)
- Chance of Success estimation
- Justification for data interpretation
- Project planning (e.g., costs and time)
- Portfolio outcomes
- Production targets
- Opinion of prospect, opportunity
Representative bias is the tendency to ignore what is generally true and to assume that the specific or familiar is more probable than the general, as we tend to rely on past experiences and stereotypes. For example, in the book “Thinking, Fast and Slow” (Kahneman, 2012) we are confronted with the “Linda problem”, in which we are given a description of Linda and asked which of two statements is more probable:
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
- (a) Linda is a bank teller
- (b) Linda is a bank teller and is active in the feminist movement
Surprisingly, Kahneman found that 85–90% of respondents thought (b) was more likely than (a) – even though (b) is a conjunction of (a) with a further condition, and so can never be more probable than (a) alone.
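The arithmetic behind the Linda problem – the conjunction fallacy – can be made concrete with a small simulation. This is an illustrative sketch only; the trait frequencies below are made-up assumptions, not figures from the lecture.

```python
import random

random.seed(1)
# Assign two independent traits to a simulated population
# (5% and 30% are arbitrary illustrative frequencies)
population = [
    {"bank_teller": random.random() < 0.05,
     "feminist":    random.random() < 0.30}
    for _ in range(100_000)
]

n_teller = sum(p["bank_teller"] for p in population)
n_both   = sum(p["bank_teller"] and p["feminist"] for p in population)

# The conjunction can never outnumber one of its parts:
# everyone counted in n_both is also counted in n_teller
assert n_both <= n_teller
print(f"bank tellers: {n_teller}, bank tellers AND feminists: {n_both}")
```

However plausible the description makes the conjunction sound, the set of "bank tellers who are feminists" is a subset of "bank tellers", so its probability cannot be larger.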
Industry Examples of Representative Bias include:
- Interpreting one well outcome as typical of a prospect
- Interpretation of “anomalies” and complexity
- Choice of appropriate analogue
- Revision of Resource or Chance estimations given new information or data
- Continued drilling of dry holes in a play
- Application of statistics (e.g., sample size, conjunction fallacy, base rates, regression to the mean, randomness)
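The base-rate point in the last item can be illustrated with Bayes' Theorem. The numbers below – a 10% play success rate and an imperfect "positive" seismic indicator – are hypothetical, chosen purely for illustration.

```python
# Base-rate illustration (made-up numbers): a play where only 10% of
# prospects are discoveries, and an indicator that reads "positive"
# on 80% of discoveries but also on 30% of dry holes.
p_discovery = 0.10                  # base rate: what is generally true
p_pos_given_disc = 0.80             # true-positive rate of the indicator
p_pos_given_dry  = 0.30             # false-positive rate

# Total probability of seeing a positive indicator
p_pos = (p_pos_given_disc * p_discovery
         + p_pos_given_dry * (1 - p_discovery))

# Bayes' Theorem: P(discovery | positive)
posterior = p_pos_given_disc * p_discovery / p_pos
print(f"P(discovery | positive indicator) = {posterior:.2f}")
```

Because dry holes dominate the play, a positive indicator raises the chance of success to only about 23%, not the 80% our intuition is tempted to read off the indicator's reliability – a reminder to pay attention to the base rate.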
Recent research suggests that our biases are not “faults” in our thinking. Rather, they are tools that evolved to make quick and efficient judgements and decisions, and they have persisted because they generally produce correct results. Marc believes that our biases do not mean we are “bad” or “irrational” thinkers; biases are natural features of our thinking. However, in the uncertain and complex environment in which we work, these biases can lead to poor evaluations and decisions that erode value.
Cognitive Bias Mitigation
Marc concluded the lecture with ways to mitigate cognitive bias, commenting that there is unfortunately no magic bullet: biases are extremely robust and difficult to counter. Much of the research and literature has concentrated on highlighting and explaining the biases and their impact, with little published on strategies to overcome them. However, he suggested the following, predominantly in the areas of awareness and action:
- Be aware of cognitive biases in yourself and others
- Avoid artificially reducing uncertainty (i.e., narrowing the range)
- Remain open to new information
- Question and challenge all estimations and supporting justifications
- Employ a structured elicitation process (i.e., synthesis of opinions and knowledge)
- Engage in Disconfirmation – try to falsify your interpretation
- Consider what could go wrong and why something may fail
- Consider alternative scenarios and multiple working hypotheses
- Have in place a team to provide independent assessment and challenge
- Work with multiple anchors
- Pay attention to the base rate, what is generally true
- Become familiar with statistics
The lecture was well-attended, with lots of positive discussion during the reception afterwards. The PESGB would like to thank Rose and Associates for this interesting and informative lecture, and for their sponsorship of the reception.