Why did he make such an obvious mistake...?


GLOC

Contributor
Scuba Instructor
Messages
120
Reaction score
147
Location
Malmesbury, UK
# of dives
500 - 999
BshLtCfKRe2K5hQX43GJ_Screenshot_2017-01-22_07.56.55.png


Recently there have been a number of high-profile incidents in diving in which the decisions made appear incomprehensible given the outcome. To borrow a phrase from Todd Conklin, a US-based high-performance team developer and safety advocate, the outcomes, whilst sad in many cases, are not the most interesting part; it is the decision-making process which led to them that is interesting. You could say that they are unsurprising, or normal. In fact, in 1984 Charles Perrow wrote a book entitled ‘Normal Accidents’ in which he describes how most accidents in high-risk operations are normal: we know they could happen, we just don’t know where and when, and given our limited personal and organisational capacity to process massive amounts of information, it is no surprise that they slip by our attention and we make flawed decisions. It is this combination of often discrete decisions which leads to disaster.

That is what this blog post is about: decision making, and why, in hindsight, we view an adverse event with a clarity unavailable to those involved at the time, and why we are often unable to rationalise the thought processes which led to their demise.

I am not biased! Yes you are! We all are...

In the first part we will look at the key biases we are exposed to when examining adverse events. The first of these is hindsight bias: the assumption that we would have known what the outcome was going to be had we been placed in that same position, with the same information, training and environmental/physical cues. It is incredibly difficult to isolate this bias and remove it from our thought process, other than by asking the simple question, “Why did it make sense to those involved to behave in the manner they did?”

An example of this might be the close call which the National Park Service Submerged Resources Center team had three years ago, when their chief scientist had an oxygen-toxicity-induced seizure at 33m/110ft on their VR Technology Sentinel and was recovered by the buddy and dive team. The technical reason for the event was that two of the three oxygen cells were current limited and, due to the voting logic within the system, the controller drove the pO2 in the loop up to more than 3.00. The cells were current limited because of their age: at 33 months old, they were long overdue for replacement. You might ask why they weren’t replaced in a timely manner. A sketch of the technical failure mode follows; after that, a brief explanation in terms of decision making.
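To see how two degraded cells can outvote a healthy one, here is a minimal sketch of three-cell median voting in Python. It is purely illustrative - the function, the numbers and the use of a simple median are my assumptions for the example, not VR Technology's actual firmware logic.

```python
# Illustrative sketch of three-cell voting logic on a rebreather controller.
# Cell readings are pO2 in bar. A current-limited cell reads plausibly at low
# pO2 but caps out (here at ~1.0 bar) instead of tracking the true value.

def voted_po2(cells: list[float]) -> float:
    """Return the value the controller believes: the median of three cells.

    With ONE bad cell the median rejects it, but with TWO current-limited
    cells the median sits on a bad reading and the good cell is outvoted.
    """
    return sorted(cells)[1]

# Suppose the solenoid has already driven the true loop pO2 up to 1.6 bar.
good_cell = 1.6
limited_cell = 1.0   # current limited: cannot read above ~1.0 bar

readings = [good_cell, limited_cell, limited_cell]
print(voted_po2(readings))  # 1.0 -> the controller thinks pO2 is low...
# ...so it keeps injecting oxygen, and the real pO2 climbs past 3.0 while
# the voted value never rises above the limited cells' ceiling.
```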

The team had been diving a mixture of AP Inspirations and VR Sentinels. The AP units had always had their cells replaced by AP when they went back for a service. The Sentinels, when they went back for their service, had come back with a test sheet saying the cells were operating at 100% of spec. The assumption on the part of the team, based on their previous experience with the AP units, was that the cells had been changed; in fact this was not the case, although the units did include a note saying the cells should be changed every 18 months. In addition, the service had happened four months before the event, so there was an ongoing assumption that the unit was good for another eight months or so, especially as it had not been dived in the first three of those four months. Further, the day before the event the unit had performed flawlessly with no current limiting. In hindsight(!!), the team should have checked the cell dates before each dive, or at least during each unit assembly, but they didn’t, because their mental model of the world was different to reality. (This shortfall has been addressed since the event. There were also many other issues which contributed to the event, but they have not been covered here for brevity.)
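Stated in code, the missing check is almost trivially simple - which is exactly why hindsight bias is so seductive. The sketch below assumes only the 18-month interval from the service note; the dates and the function itself are hypothetical.

```python
from datetime import date

CELL_LIFE_MONTHS = 18  # replacement interval from the manufacturer's service note

def cell_expired(manufactured: date, today: date) -> bool:
    """True if an O2 cell has exceeded its replacement interval."""
    age_months = (today.year - manufactured.year) * 12 + (today.month - manufactured.month)
    return age_months >= CELL_LIFE_MONTHS

# The cells in the incident were 33 months old: well past the limit,
# regardless of how flawlessly they performed on the previous day's dive.
# (Dates here are illustrative only.)
print(cell_expired(date(2011, 1, 1), date(2013, 10, 1)))  # True (33 months)
```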

This brings us nicely onto the next bias, outcome bias: how we often judge the quality of a decision-making process based on the outcome we perceive. As another example, reflect on the following. A team of two full-cave-certified divers were travelling into a cave system where the No 2 diver had been before but the lead diver hadn’t. After approximately 30 minutes they passed through a cenote with amazing shafts of light shining down, which took their attention away from the line whilst they kept swimming. After another five minutes or so, the second diver started to get an uneasy feeling about where they were, because the cave didn’t look right. He stopped, signalled the lead that something wasn’t right, turned around, swam back up the line and found the jump which they had swum over without realising it, their attention drawn to the shafts of light. Coincidentally, just beyond their jump there was another line and, because of their brief distraction, they had followed it without realising they had changed lines.

If the second diver had not been in that cave before, he would not have had that uneasy feeling, and they would have carried on until something was obviously wrong against the plan/brief, or until they reached their gas turn pressures. Now consider what the narrative would have read like if they had had an emergency which necessitated a gas-sharing blind exit, reached the terminal point of the line they were on, had to conduct a ‘blind jump’ which they were unable to do, and as a consequence drowned. The narrative would likely say, “They conducted a jump without a line in place. How stupid, they broke a golden rule of cave diving!” In the first case it is also easy to say “they should have paid more attention”, but the use of ‘should have’ or ‘could have’ does not help prevent future adverse events. How severe, and how prevalent, are such un-noticed blind jumps? After a quick chat with colleagues here in Helsinki at Techu2017: not that uncommon…

Decision making is complicated!

So if these major biases colour our view of decision making after the event, what informs our decisions during the activity?

It is complicated! It involves situational awareness, ‘decision making’ and a feedback loop to inform future decision-making strategies. It also involves the goals and objectives we are aiming for, as these tunnel our view of the world: we primarily look for things which confirm our predicted outcomes. It involves the long-term memories and experiences we have built up through ‘life’, training and previous dives. It involves the equipment we are using and the environment we are in, and whether those are adding to or draining our mental capacity. The diagram of Endsley’s SA model below shows the linkages in this decision making and situational awareness. This is why SA is not just about what is going on around you right now; it is how we perceive and process the information, and then our ability to project it into the future.

[Figure: Endsley's model of situation awareness in dynamic decision making (Endsley, 1995)]
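To make the linkages concrete, here is a toy, single-pass rendering of that loop. The three level names follow Endsley's model; the dive-specific details are invented purely for illustration.

```python
# Toy rendering of Endsley's three levels of SA feeding a decision.
# Stage names follow Endsley (1995); everything dive-specific is invented.

def perceive(environment: dict) -> dict:
    """Level 1: perception of the elements around us (gauges, buddy, line)."""
    return {"depth_m": environment["depth_m"], "gas_bar": environment["gas_bar"]}

def comprehend(percepts: dict, goals: dict) -> dict:
    """Level 2: comprehension of what those elements mean for current goals."""
    return {"gas_low": percepts["gas_bar"] <= goals["turn_pressure_bar"]}

def project(meaning: dict) -> str:
    """Level 3: projection of near-future status into a decision."""
    return "turn the dive" if meaning["gas_low"] else "continue"

# One pass of the loop. Real SA runs continuously, and the outcome of each
# decision feeds back into what we look for next - the feedback loop the
# article notes is so often missing.
state = {"depth_m": 33, "gas_bar": 140}
goals = {"turn_pressure_bar": 150}
print(project(comprehend(perceive(state), goals)))  # "turn the dive"
```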


However, to simplify, we have three key ways of making decisions: skill-based, rule-based and knowledge-based. The terms skill-, rule- and knowledge-based information processing refer to the degree of conscious control exercised by the individual over his or her activities.

In the image below, knowledge-based processing operates in an almost completely conscious manner. For a beginner diver learning to put up a dSMB for the first time, this means solving the problem as they go along, learning the steps which make up the skill. For an experienced diver, it might be a novel situation, such as a small collapse happening inside a wreck, where they have to problem-solve dynamically. The skill-based mode refers to the smooth execution of highly practised, largely physical actions in which there is virtually no conscious monitoring.

Skill-based responses are generally initiated by some specific event, e.g. the requirement to dump gas from the BCD as the diver ascends. The highly practised operation of dumping the gas will then be executed largely without conscious thought. In highly skilled individuals, error rates are in the order of 1 in 10,000. However, when we move to knowledge-based decision making, where we have to use previous experiences to solve the problem and come up with a viable solution, the error rate rises to somewhere between 1 in 2 and 1 in 10!

[Figure 1: Knowledge- and skill-based modes of information processing (Embrey & Lane, 1990)]
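A rough back-of-envelope calculation shows why that difference matters across a whole dive. The per-action error rates are the ones quoted above; the action counts are invented for illustration.

```python
# Rough arithmetic on the error rates quoted above, per action performed:
SKILL_BASED_ERROR = 1 / 10_000   # highly practised, automatic actions
KNOWLEDGE_BASED_ERROR = 1 / 10   # novel problem-solving (best case of 1:2 - 1:10)

def p_at_least_one_error(rate: float, actions: int) -> float:
    """Probability of one or more errors across independent actions."""
    return 1 - (1 - rate) ** actions

# Say a dive involves ~200 skill-based actions but only 2 knowledge-based ones.
print(f"skill-based:     {p_at_least_one_error(SKILL_BASED_ERROR, 200):.1%}")    # ~2.0%
print(f"knowledge-based: {p_at_least_one_error(KNOWLEDGE_BASED_ERROR, 2):.1%}")  # ~19.0%
# Two moments of real-time problem solving carry roughly ten times the error
# exposure of two hundred well-drilled skills.
```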
 
In the following image, another category of information processing involves the use of rules. These rules may have been learned during diver training or by diving with more experienced divers. The level of conscious control is intermediate between that of the knowledge-based and skill-based modes.

[Figure 2: Rule-based mode of information processing (Embrey & Lane, 1990)]
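Rule-based processing can be pictured as a condition-to-action lookup, somewhere between deliberate reasoning and automatic skill. A toy sketch follows; the rules are generic training maxims, chosen purely for illustration.

```python
# Rule-based processing as a condition -> action lookup: less deliberation
# than knowledge-based reasoning, more conscious than a pure motor skill.

RULES = [
    (lambda s: s["gas_bar"] <= s["turn_pressure_bar"], "turn the dive"),
    (lambda s: s["po2_bar"] > 1.6,                     "bail out / flush the loop"),
    (lambda s: s["lost_line"],                         "stop, tie off, search"),
]

def rule_based_decision(situation: dict) -> str:
    """Fire the first matching rule; otherwise fall through to knowledge-based mode."""
    for condition, action in RULES:
        if condition(situation):
            return action
    return "no rule matches: problem-solve (knowledge-based)"

print(rule_based_decision({"gas_bar": 120, "turn_pressure_bar": 150,
                           "po2_bar": 1.2, "lost_line": False}))
# -> "turn the dive"
```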


Often what is missing in the real world is a feedback loop to help us determine whether the decisions we have made were good ones or could have been better. Just because the decision (dive) didn’t end in injury or death doesn’t mean it was the best or ‘safest’ way of doing it. (Safety is a relative concept, considering that the risk of any dive includes death.)

Experts more often make better decisions because they have more experiences to refer to (the bottom of Endsley’s model). To create this knowledge they need to have practised, failed and learned why something is done a certain way and why the failure happened. Novices faced with the same situation will search their long-term memory stores for whatever matches most closely what they perceive. Sometimes all they have is a best guess, which isn’t the best way of solving such problems - but it might be the only way. Knowledge is created by understanding the ‘why’, not just repeating the ‘how’.

Solutions

The paradox is that as diving becomes safer and equipment fails less often, the evidence to show there is a problem diminishes. This doesn’t mean that training should necessarily be simplified, because the risk of a system failing is still there. The problem is that as we simplify the training system and reduce the amount of ‘stress’ training delivered to divers, we create a system in which, when a failure does happen, the ‘skills’ aren’t there to execute the response, and the knowledge isn’t there either. The issue is further compounded because instructors are now in a situation where they have limited experience of failures and of problem solving in real time under ambiguous circumstances, and therefore they are unable to teach their students what they have personally learned. This is especially true of rebreather diving.

To combat this, an attitudinal change is required. Diving can be safe, but at the same time there needs to be a true recognition by all involved, from OW student dive 1 upwards, that there is a risk of death on every dive. It is often the known unknowns that will pose a problem, and we need knowledge-based decision-making skills to prevent that event from ending in disaster. That means we need to build contextual and relevant experience, not just ‘rules-based’ decision-making processes.

Finally, when we read about incidents or fatalities, we need to recognise that in the majority of cases the situation those involved faced, or the precursors to it, likely exceeded their decision-making capacity: through a lack of knowledge to problem-solve, a lack of understanding of the relevance of the ‘rules’, or because they just didn’t have the skills needed. The fact that it ended up with someone scared, injured or dead is just an outcome. We need to focus on the activities and decisions prior to the event (and not just immediately prior, either!) and fix that decision-making process if we are to make diving safer and reduce unnecessary risk. Remember that hindsight bias is incredibly powerful, so ask yourself why it made sense to them, given their experience, mindset, training, emotional state and so on. Do not use your experience to make their decisions - you know the outcome with certainty; they did not!

Figures 1 & 2 from Embrey D, Lane H. Understanding Human Behaviour and Error: The Skill, Rule and Knowledge Based Classification. 1990.
Endsley's model from Endsley MR. Toward a theory of situation awareness in dynamic systems. 1995.

Footnote:

The Human Factors Academy provides two classes to improve human performance and reduce the likelihood of human error occurring. The online class provides a comprehensive grounding in Human Factors, giving you the basic skills needed to improve human performance and reduce error, whereas the classroom-based class is very comprehensive and intense, with plenty of opportunity to learn from failure and error, and to reflect on behaviours and performance. You can see more about them, including testimonials, at www.humanfactors.academy
 
Having taken the Human Factors in Diving course, I simply cannot recommend it enough for all divers.
 
I read a book called Deep Survival by Laurence Gonzales; several chapters deal with human decision making and how errors are built into the system.
It was recommended to me and makes for a great read on the subject.
 
I have read it twice. It has applications well beyond diving. Without being fatalistic, in essence it states that failures are to be expected.
 
Interesting. $4 is too much for a book?

I'd hate to think what you think of the costs of the training which is linked at the bottom of my post!
Woosh!

Gareth, surely his sarcasm at least brushed your hair a little. :wink:
 
