17 Cognitive Biases which Contribute to Diving Accidents


GLOC
Introduction

Humans are subject to a variety of cognitive biases and heuristics (mental shortcuts). These directly impact a diver's decision-making, sometimes leading to incorrect judgments under certain circumstances. In most settings this is relatively harmless; however, for those operating in high-risk environments, the consequences of an incorrect decision can be critical or deadly, especially if there is little time between detecting an error and being able to recover from it. The bias itself alters the diver's perception of reality, changing their understanding of the situation and filtering the true nature of events as they unfold. These cognitive issues are further compounded by physiological factors in diving such as narcosis, reduced colour perception and visibility, and the changes in sound transmission underwater.

The effects on human perception

Human perception is a “conscious sensory experience” that uses a combination of our senses and our brain to filter and interpret sensory inputs. Research has revealed a number of common ways in which our brain's perception is modified. While these biases act as filters that can hinder our ability to make accurate decisions, they are also considered essential for coping with the massive amount of information we have to process in short periods of time. This blog covers this reduction process in more detail. The problem we face in real time in high-risk scenarios is that we are often unaware of this filtering and reality-modification process.


Types of cognitive bias

There are many types of cognitive bias that can influence divers’ safety because they impact risk perception and acceptance. The following are some examples of biases that can be particularly dangerous to divers:

Ambiguity effect

An aspect of decision theory where a person is more likely to select an option whose risk is intuitively clear over one whose risk seems less certain. This can lead someone to choose the riskier option simply because its risk is better understood. For example, a CCR diver stays on the loop when there is a fault in the rebreather rather than bailing out and making an ascent on open circuit.

Anchoring bias

A bias where people make decisions based on a provided data point. For example, if given a baseline amount of gas as a requirement, or a depth for the dive, that number will be used to determine requirements regardless of whether operational needs might actually require much more, or much less. This might be 'surface with 50 bar/500 psi', but with no understanding of what that number means in terms of cylinder size, depth or breathing rate.
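To make that concrete, here is a rough Python sketch of how much time a 50 bar reserve actually buys; the 12-litre cylinder and 20 L/min surface consumption rate (SAC) are illustrative assumptions, not figures from this post.

# Rough sketch: minutes of gas in a fixed reserve, using assumed figures.
def reserve_minutes(reserve_bar, cylinder_litres, depth_m, sac_l_per_min):
    ambient_ata = depth_m / 10 + 1                 # absolute pressure at depth (metric approximation)
    gas_available = reserve_bar * cylinder_litres  # free litres held in the reserve
    consumption_at_depth = sac_l_per_min * ambient_ata
    return gas_available / consumption_at_depth

print(round(reserve_minutes(50, 12, 10, 20), 1))   # ~15 min at 10 m
print(round(reserve_minutes(50, 12, 30, 20), 1))   # ~7.5 min at 30 m

The same anchored '50 bar' is worth roughly half as much time at 30m as at 10m, which is exactly the context the number on its own strips away.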


Attentional bias

Humans pay more attention to things that have an emotional aspect to them. In diving, this could lead to a person making a decision based on a perceived problem arising from a past experience. For example, if a diver has had a DCI event, or someone close to them has, they might ignore the risk of running low on gas in an attempt to avoid DCI. An incident was recounted to me about a diver who ran out of deco gas because they didn't understand how the 6m stop was managed on their SUUNTO computer and were afraid of getting bent, despite having been close to 6m for a long time.

Attentional tunnelling

This has been defined as “the allocation of attention to a particular channel of information, diagnostic hypothesis, or task goal, for a duration that is longer than optimal, given the expected cost of neglecting events on other channels, failing to consider other hypotheses, or failing to perform other tasks”. This can be more simply explained as the ‘7 +/- 2 lightbulbs’ of mental capacity: if a number of those lightbulbs are taken up with basic tasks, the capacity to monitor other tasks is limited, despite the risks of not completing those apparently 'secondary' tasks. An example of this would be shooting UW video and not monitoring pO2, as in the case of Wes Skiles. (The incident was much more complex than I have just explained, but attentional tunnelling was a major contributory factor.)

Automaticity

While not a bias, this refers to the fact that humans who perform tasks repeatedly will eventually learn to perform them automatically - so-called muscle memory. While generally a positive attribute, this can lead to a person automatically performing a function (such as a checklist item) without actually being cognisant of the task itself. Expectation bias can lead them to assume that the item is correctly configured even if it is not.

Availability heuristic

This describes how people over-estimate the likelihood of an event based on its emotional impact, or on how much personal experience they have had with that type of event. This can lead to incorrect assessments of risk, with some events being assigned more risk than they warrant and others not enough. An example might be how much focus is placed on DCI (a fairly rare event) compared to running low on, or out of, gas, which is much more common.

Availability cascade

This is a process where something repeated over and over will come to be accepted as fact. An example of this is the misconception that diving nitrox extends your bottom time AND makes diving safer on the same dive. Rather, the minimum-decompression time can be extended at the same level of DCI risk, or the risk of DCI can be reduced if the air minimum-decompression time for the same depth is used; e.g., roughly the same decompression obligation exists for 32% nitrox at 30m for 30mins as for air at 30m for 20mins, but you cannot both be safer and have the longer bottom time.
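As a rough illustration of where those nitrox numbers come from, here is a short equivalent air depth (EAD) sketch using the standard metric formula; treat it as an illustration only, not dive-planning advice.

# Sketch: equivalent air depth (EAD) for a nitrox mix, metric formula.
def ead_metres(depth_m, o2_fraction):
    n2_fraction = 1 - o2_fraction                      # nitrogen fraction of the mix
    return (depth_m + 10) * (n2_fraction / 0.79) - 10  # depth at which air gives the same nitrogen loading

print(round(ead_metres(30, 0.32), 1))   # ~24.4 m: EAN32 at 30 m loads nitrogen roughly like air at ~24 m

That lower nitrogen loading can be spent either on a longer minimum-decompression time at the same risk, or on a lower risk for the same time, but not on both at once.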

 
Base rate fallacy

Historically, a lack of data has meant that divers are unable to see large statistical trends. When a person focuses on specific events (which might be non-events) rather than on the probability across the whole set, they ignore the base rate and tend to base judgments on specifics rather than on general statistical information. This can affect divers' ability to accurately assess the risk of certain decisions. Examples in diving could be cardiac risk or out-of-gas situations. A lack of population non-fatality data makes this bias even more pronounced.
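A toy Bayes-style calculation shows how ignoring the base rate distorts risk judgments; every figure below (incident rate, symptom rates) is invented purely for illustration and is not real incident data.

# Toy sketch of the base rate fallacy with invented numbers.
base_rate = 1 / 10_000            # assumed fraction of dives ending in serious DCI
p_symptom_given_dci = 0.90        # assumed: symptom reported in most DCI cases
p_symptom_given_ok = 0.05         # assumed: same symptom also follows many benign dives

p_symptom = (p_symptom_given_dci * base_rate
             + p_symptom_given_ok * (1 - base_rate))
p_dci_given_symptom = p_symptom_given_dci * base_rate / p_symptom
print(f"{p_dci_given_symptom:.2%}")   # ~0.18%, far lower than the symptom alone suggests

Judging only from the vivid specific (the symptom, the single event) while ignoring how rare the underlying condition is leads to exactly the kind of skewed risk assessment described above.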

Confirmation bias

This describes a situation where a person will ignore facts or information that do not conform to their preconceived mental model, and will assume as true any information that does conform to their beliefs. This is very dangerous in diving, where a diver might form an incorrect mental model of their situation and have a very difficult time changing that view even in the face of new information. An example of this would be a diver who is convinced that their CCR is functioning correctly, despite more and more warnings to the contrary. If linked with alarm blindness (where frequently occurring alarms come to be ignored), this can be a critical problem.

Expectation bias

This might be considered a subset of confirmation bias, but describes a situation where a person sees the results they expect to see. Given the (un)reliability of some electronics, alarms that 'always happen' come to be treated as false positives, and the diver expects the alarm to be telling a falsehood. Unfortunately, they have limited ability to cross-check other than bailing out, which normally means the end of the dive.


Optimism bias

As the name suggests, this is a situation where people are overly optimistic about outcomes. It is a common issue in diving, as divers have seen so many bad situations turn out “okay” that the sense of urgency and risk can be reduced when such reduction is not warranted.

Outcome bias

This is the tendency to take outcomes into account when they are irrelevant to the decisions involved (Baron and Hershey, 1988). An example in diving would be comparing an incident in which a cave diver completed a blind-jump and ended the dive successfully with one in which the blind-jump led to the death of a diver. Outcome bias can contribute to normalisation of deviance.

Overconfidence effect

As the name suggests, there is a strong tendency for people to overestimate their own abilities or the quality of their own judgments. The Dunning-Kruger effect is a great example of this. This has fairly obvious implications in diving. Overconfidence is often developed because of the positive reinforcement techniques used in training and the desire or need to pass students who have not yet achieved ‘mastery’ of the skills needed.

Plan Continuation / Sunk Cost Fallacy

This might be considered a subset of confirmation bias. There is a strong tendency to continue to pursue the same course of action once a plan has been made, but it may also be influenced by some of the same issues that lead to the “sunk cost effect”, where there is a “greater tendency to continue an endeavour once an investment in money, effort, or time has been made” (Arkes and Blumer, 1985). Plan continuation bias is the basis of the case study used in the online micro-class, in which a diver ran out of gas at 60m and bolted to the surface, which led to his death.

Prospective Memory

A common situation where one needs to remember to do something in the future. It can be particularly challenging when faced with distractions of any sort, e.g., a person is driving home from work and needs to stop to pick up milk en route. If that person then gets a phone call, there is a high probability that they will forget to stop at the store. In diving, an example would be an instructor who is assembling their rebreather and is interrupted by a student, leading to a critical step being missed when they return to the unit to carry on with the assembly.


Selective perception

There is a strong bias to view events through the lens of our belief system. This is different from expectation bias in that it generally applies to our perception of information as filtered by the belief system itself, while expectation bias more generally describes situational awareness shaped by the things we expect to happen. Selective perception can lead to an incorrect hypothesis, such as a belief that a DCI event was caused by a specific profile rather than by other factors.

 
Conclusion

While published research into the role of cognitive biases in diving incidents and accidents is not readily available, there would have to be something very different about diving if human behaviour in this environment were different from that seen in aviation, nuclear, rail, healthcare, and oil & gas.

Unfortunately, research has also shown that most people are unaware of these biases and how they impact human performance when certain latent issues converge to create the grounds for an accident or near miss.

One way in which diving safety could be improved would be to ensure that these topics are part of an accident or incident investigation process rather than being captured under the general topic of ‘human/diver error’.

[Note: how many people counted the biases? There were only 16 listed. Expectation bias at play.]

Footnote:

The Human Factors Academy provides globally unique classes to improve human performance and reduce the likelihood of human error occurring in the diving domain. We operate in other domains too, such as Oil and Gas and Healthcare.

Online micro-class (9 modules of approximately 15 mins each): Human Factors Skills in Diving Micro-class

Webinar-based programmes starting 22 Jan 2018
Human Factors Skills in Diving - Webinars

Upcoming classroom-based course dates are here Training Course Dates

Facebook Group
The Fallible Diver: Human Factors in Diving
 
