One of the most powerful psychological effects is the feeling of social conformance. As humans we are wired to be part of a group: we want to be associated with others who are similar in fashion, outlook, activity, or training, or who simply enjoy the same sport. Diving, in the main, is a social activity where we create friendships and want to help others learn, develop and have fun.
But...when we have an incident, the community doesn’t always help. You only have to look at social media for the external judgements that take place, especially if a serious incident has happened or someone has been killed. In fact, it is normally only when someone dies that we hear about the event, because it is not possible to hide the fact that someone has died. In addition, because there is a lack of detail, the commentary starts by judging that what they did was ‘stupid’ and that it should have been obvious it would end this way. This lack of context and detail is a real challenge when it comes to improving diving safety. [See Learning Teams Blog]
However, when we look a little deeper, we can see there are a few biases at play which lead us to make these negative judgements of others.
There are two key biases at play: the first is hindsight bias, which is covered in another blog; the other is outcome bias. In essence, outcome bias is where we judge more critically when the outcome is severe than when it isn’t, even if the process leading to the outcome was the same. The following two examples illustrate this.
A diver who dives independent twins and regularly forgets to swap his regulators exits a 30m (100ft) dive with 30 bar (~450 psi) in one side and 210 bar (~3,000 psi) in the other. He jokes that he had a lapse during the dive and forgot to check/swap his regulators. Another diver in a similar situation on another 30m (100ft) dive has his buddy come to him out of air (OOA) near the end of the planned dive; he donates the working regulator in his mouth to the buddy, who empties the cylinder quickly. Now they have to conduct an air share on the ascent, the OOA buddy panics, and a rapid ascent follows, leading to severe DCS. Honestly, consider how you judged each of the dives in question.
Which one is judged more severely in your own mind? Now consider a slight variation: what if the second diver with independent twins was the same as the first, and they hadn’t learned from the first dive? Would you judge them more severely? What if the OOA diver had died on the second dive? Would that have made the judgement even more severe? If you did, you are normal. If you didn’t, you have thought quite a bit about this.
Two full-cave divers were diving in a system and, approximately two-thirds of the way to their turn-pressure, they got distracted and missed a point where the line changed direction: they swam over a gap and picked up a new line by accident. They did not see the gap. They continued to swim for another 4-5 minutes before the second diver recognised that they were not in the cave they should be, having been in the system 3 years previously. The lead diver had never been there before and so didn’t know better. The following diver signalled that something wasn’t right; they turned around, found the jump they had swum over and missed, and then rejoined their original plan. Another pair of divers were in the same cave, and they too swam over the jump, but didn’t notice they were in the wrong cave and swam to the full extent of their gas plan. During the exit, a collapse happened elsewhere in the cave system causing a silt-out, necessitating a blind exit. Shortly after this, a failure of the manifold meant that a gas share was required. They reached the end of the line where the jump was, and they drowned because they couldn’t find the exit line and ran out of gas.
How would you judge the jumping of the line in the first case compared to the second?
In both cases, what if the diver was well known, a ‘star’ and a well-respected member of the diving community? What about others involved? The skipper? The buddies? The instructor who trained them? We often look for someone to blame, rather than examining the system in which we as divers undertake our sport.
Research in other domains has shown that decision making is where the majority of accidents have their genesis. To improve diving safety, we need to understand the decision-making process of those involved and why it made sense for them to make the decisions they did. Peer pressure, social conformance, time pressures, goal fixation…the list goes on. The diagram below, from Amalberti et al., shows how we often migrate to more risky behaviours because of the pressures driving from the right-hand side of the diagram. Whilst this is based on healthcare, the concepts are applicable in many other domains, including diving.
Violations and migrations in health care: a framework for understanding and management. Amalberti, 2006
The problem we have with diving fatalities is that the decision maker is normally dead, which is why, in my opinion, fatalities are a really poor way to learn compared to non-fatal incidents. This opinion is based on the fact that we do not have a robust and independent accident investigation system with clear definitions of contributory factors associated with human factors. Furthermore, many fatalities are the subject of litigation, where the aim is to find out who is at fault, not how we can learn. As a couple of examples of this, one training agency has at the top of their incident reporting form ‘This form is being prepared in the event of litigation’, which is unlikely to encourage descriptions of failures of the ‘system’, and at DEMA last year one of the insurance underwriters stated ‘As long as your paperwork is complete, then you will be safe’, implying safe from litigation rather than operationally safe. Neither of these attitudes helps learning.
Therefore, in my opinion, non-fatal incidents are much more effective to learn from, but that requires a psychologically safe environment to be effective. A psychologically safe environment is one in which we will not be humiliated or made to feel less good about ourselves if we speak up about concerns we have, if we make a mistake, or if we point out a mistake. Very rarely do people make decisions intending to harm others or themselves; therefore it is essential to understand what shaped their decision-making processes. A ‘Just Culture’ is another way of describing a psychologically safe environment. When a ‘safe’ environment is in place, people are more likely to report their errors or mistakes so that others can learn. Within small communities, this can be made to work because there is a level of trust, but it still needs to be actively managed in case divers fall back into old habits. What would be great is if the community could grow up and stop throwing rocks at others for being ‘human’ and making mistakes.
One of the key outputs from detailed narratives of non-fatal incidents is the stories which can be retold during training or coaching - in effect, giving examples of ‘real world’ diving incidents/accidents and explaining why drills or protocols are in place. Teaching divers the ‘why’ behind the ‘rules’ allows them to problem-solve more effectively when they encounter a situation which doesn’t match their training or the ‘rules’. Consequently, they are able to refer to the mental models which have been created, which leads to better decision making. This is why experts make better decisions than beginners or inexperienced divers - they have more models to relate to. This simplistic model from Gary Klein about Recognition Primed Decision Making explains this process and shows why training should not just be about skills taught in a rote manner, but also about the ‘why’ something is done.
Recognition Primed Decision-making Model. Streetlights & Shadows. Klein, 2009.