We all make errors. Let's not judge those involved without first understanding how it made sense to them at the time.


GLOC (Scuba Instructor, Malmesbury, UK)

Originally posted on the Human Factors Academy Blog.

“People make errors, which lead to accidents. Accidents lead to deaths. The standard solution is to blame the people involved. If we find out who made the errors and punish them, we solve the problem, right? Wrong. The problem is seldom the fault of an individual; it is the fault of the system. Change the people without changing the system and the problems will continue.”
Don Norman, author of The Design of Everyday Things

When looking at failures we need to understand why it made sense for those involved to behave or operate the way they did (see the post about local rationality). Human Factors and Ergonomics (HF/E) is a recognised science that deals with this, and it really came to the fore in World War II through two developments. The first was the use of checklists to improve aviation safety following the crash of the prototype B-17; the second was the shift to looking at the 'system', because technologies were advancing faster than humans could adapt and errors were still happening.

In 1943 the U.S. Army Air Forces called in psychologist Alphonse Chapanis to investigate repeated instances of pilots making a dangerous and seemingly inexplicable error: the pilots of certain models of aircraft would touch down safely and then mistakenly retract the landing gear. The massive aircraft would scrape along the ground, erupting in sparks and flames. Despite numerous recommendations for more training and closer adherence to checklists, the problem continued. Chapanis interviewed pilots, but he also carefully studied the cockpits. He noticed that on B-17s, the two levers controlling the landing gear and the flaps were identical and placed next to each other. These levers were like the little dolly light switches below.

[Image: arts-and-crafts copper 'dolly' light switches]


Normally a pilot would lower the landing gear and then raise the wing flaps, which act as airbrakes and push the plane down onto its wheels. But in the chaos of wartime, a pilot could easily grab the wrong lever and retract the landing gear when he meant to raise the flaps. Chapanis's solution: attach a small rubber wheel to the landing-gear control and a flap-shaped wedge to the flap control. Pilots could immediately feel which lever was the right one, and the problem went away. In this case it was clear that the problem wasn't the pilot but the design of the technology that surrounded him. (In the panel image below, the gear and flap controls are at top right.)

[Image: B-17 cockpit control panel, gear and flap controls at top right]
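
For the software-minded, a loose analogy (mine, not the article's): Chapanis's shape coding works like giving two similar controls distinct types, so that grabbing the 'wrong lever' is caught before it can do harm. A minimal Python sketch, with all names hypothetical:

```python
# Illustrative analogy only: two similar "levers" become distinct types,
# just as Chapanis made the physical controls distinguishable by feel.
from dataclasses import dataclass


@dataclass(frozen=True)
class GearLever:
    """Landing-gear control (the wheel-shaped knob in Chapanis's fix)."""


@dataclass(frozen=True)
class FlapLever:
    """Wing-flap control (the flap-shaped wedge in Chapanis's fix)."""


def retract_gear(lever: GearLever) -> str:
    # Accepts only a GearLever; a checker such as mypy rejects a FlapLever.
    return "landing gear retracted"


def raise_flaps(lever: FlapLever) -> str:
    return "flaps raised"


gear, flaps = GearLever(), FlapLever()
print(raise_flaps(flaps))    # the right "lever": prints "flaps raised"
# raise_flaps(gear)          # the wrong "lever": flagged by a type checker
```

The design principle is the same in both worlds: make the wrong action hard to perform, rather than relying on training and vigilance alone.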


With the above in mind, and considering diving in particular, the following should be taken as true:

  • The best people can make the worst mistake
  • Systems will never be perfect
  • Humans will never be perfect
When diving incidents are discussed, the failure is often attributed to the individual for 'not following the rules', 'exceeding their certifications', or 'just lacking common sense, as it was obvious what was going to happen'. As diving is a recreational sport, there is very little that can be done in terms of formal punishment (and so it should be), but a more powerful punishment is often dispensed: criticism through social media. Armchair divers who were not there, who could not know the level of knowledge or training of those involved, and who could not know their motivations and goals (see the Situational Awareness blog), still post judgements about 'how stupid could they be'.

Whilst the majority of diving sits outside formal rules, some rules do exist. In the UK, for example, if a DCS event occurs while operating at a commercial site, it must be reported to the site staff. The primary reason is to provide immediate first aid and evacuation if needed. If fear of an investigation by the HSE (or equivalent) means no report is made, no-one learns from the event and, more importantly, treatment can be delayed with potentially serious consequences. In a survey I conducted two years ago, more than half of the instructors who responded had not reported DCS; the reasons for not reporting were not captured. Unfortunately, under-reporting is a major issue across many industries, which means the scale of the problem is rarely known.

The fear of reporting, either in a constructive manner on social media, or reporting through official channels when required, is indicative of the lack of a Just Culture.

Just Culture

So what is a Just Culture? It is one that is open, fair, and focused on learning; combined with the design of safe systems and the management of behavioural choices, it creates an effective safety culture. The majority of research into Just Culture has been conducted in formal, established environments, so the framework is normally about punishments framed around legal or HR requirements and rules. The majority of diving is not like this, but learning across the community is still essential: there is no way divers can be taught everything on a training course, and some of the lessons that would otherwise be learned in the real world could end with someone dead.

One of the myths of Just Culture is that it is blame-free. This is not the case. If reckless behaviours have been undertaken where there is a duty of care, they need to be reported in the interest of safety. Outside the duty-of-care construct, risk is managed at an individual level, but risk perception and acceptance are individually referenced, so the feedback system is different. Not reporting your error, and thereby preventing the system from learning, is the greatest problem of all. Some actions do warrant disciplinary or enforcement action (when operating in an environment with a duty of care, be that voluntary or commercial), or some other means of correcting what has happened so it doesn't happen again.

The key question is: where do you draw the line for disciplinary action for those with a duty of care, or for negative criticism outside a duty of care? To answer that, we all need to understand the differences between human error, risky behaviour and recklessness.

The big three to consider are (a toy sketch illustrating the distinction follows the list):

  1. Human error: an inadvertent action; inadvertently doing other than what should have been done; a slip, lapse or mistake.
  2. Risky behaviour: choices that increase risk, where the risk is not recognised or is mistakenly believed to be justified; includes violations and negligence.
  3. Reckless behaviour: a behavioural choice; intentional acts with conscious disregard for a substantial and unjustifiable risk.
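
To make the distinctions concrete, here is a toy Python sketch (my illustration, not an official MAA or Human Factors Academy decision aid) of how such a triage might be expressed:

```python
# Toy triage over the just-culture spectrum described above.
# The questions and categories are illustrative only.
def classify_act(act_intended: bool, risk_recognised: bool,
                 believed_justified: bool) -> str:
    """Classify behaviour as human error, risky, or reckless."""
    if not act_intended:
        # Slip, lapse or mistake: the person did not mean to do it.
        return "human error"
    if risk_recognised and not believed_justified:
        # Conscious disregard of a substantial, unjustifiable risk.
        return "reckless behaviour"
    # Risk not recognised, or mistakenly believed to be justified.
    return "risky behaviour"


print(classify_act(False, False, False))  # -> human error
print(classify_act(True, False, False))   # -> risky behaviour
print(classify_act(True, True, False))    # -> reckless behaviour
```

Real assessments are far more nuanced, of course; the point is simply that the categories hinge on intent and on whether the risk was recognised and genuinely justifiable.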
The diagram below is from the UK Military Aviation Authority (MAA) and shows the spectrum of options based around a 'duty of care' construct. For it to work effectively, however, the person(s) doing the assessment need to be impartial and have nothing to gain from any outcome of the decision-making process. No such model exists for recreational activities, but the concepts are still valid.

[Diagram: MAA spectrum of behaviours within a 'duty of care' construct]

The next blog will look at these subjects in more detail, along with ways in which we can address the problems we face: learning from incidents, accidents and failure, and creating an environment in which divers can talk about failure. If reporting isn't happening at a commercial level, there are likely bigger issues to resolve than the immediate fallout from the current event, such as why real events like DCS are not being reported.

Until then, consider that whilst we don't have a formal disciplinary framework for the majority of diving, we do have social judgement, and you are part of it. If you see something adverse happening, hard as it may be, don't ask how they could have made such a silly mistake; instead, consider how they determined it was a 'safe' decision. That decision will have been based on (in)experience, motivations, previous outcomes from similar situations, and training, all of which are likely to differ from yours.

Learn from your mistakes; better still, learn from others'.



Footnote:

The Human Factors Skills in Diving class teaches us, in a safe space, to learn from others and to improve our own safety methods and attitudes. It offers food for thought on how to provide a better, easier learning environment for our own students, and it allows us to be real human beings. There is, intentionally, plenty of failure to learn from on the course.

More information on Human Factors Skills in Diving classes can be found at www.humanfactors.academy

Upcoming classroom-based course dates are listed under Training Dates.

Details of the online micro-class (9 modules of approximately 15 minutes each) are under Human Factors Skills in Diving Online Mini-Course.
 
Very nice article, thanks! Another aspect I have noticed in aviation that transfers well here is the tendency to identify a human error in an incident / accident and then breathe a sigh of relief, "I don't do that so I'm fine".

I suspect that we all realize in a sub-conscious way that there is an inherent danger in what we do, regardless of how proficient we may be. So, when someone gets hurt, there is a fear reaction that takes place. If we have an error to point out, then we can deflect that fear. This happens even when the error in question is one that the person was "forced" into by systemic or procedural issues.

The net result is that once we have a "mistake" to point to, we often stop looking any further to see whether we are in fact at risk of being forced into a similar "mistake" by our own diving practices, habits or training.

BTW, Darryl says "Hi"
 
Please see the Mishap analysis & "Blamestorming" sticky at the head of the A&I forum...
Rick Murchison:
Accident analysis is the business of identifying mishap causes and recommending actions to prevent a repeat mishap.
Who's to blame doesn't matter.
The laying of blame, extraction of justice, punishment, liability, etc - all these are the business of the courts (and to satisfy our inner need for balance and justice in the universe), but they don't really address mishap prevention. Mishap prevention involves actions.
Example:
Let's say the causes of a mishap are all actions taken by a boat's captain.
Mishap analysis would identify those actions as causes, and recommend other actions that would prevent (or greatly reduce the chance of) the mishap in the future. Nothing about liability, fault, blame, punishment etc would be addressed in the mishap analysis because those are not actions that would prevent the mishap.
Example: "The boat didn't have enough fuel on board to conduct a search." might be identified as a cause of a mishap, and mishap analysis would recommend "that a boat always carry enough fuel to conduct a search" on every dive. Why the boat didn't have enough fuel, who made the decision to carry too little fuel, whose fault it is that the boat had too little fuel, etc, are all questions for regulators and courts, not mishap analysis. And it may be that there's no blame anyway - it could be that a new standard needs to be set because this mishap revealed a flaw in the current standard.
The general theme of mishap analysis is that all mishaps are preventable, even when no one is at fault. For example, all diving mishaps are preventable by not diving in the first place.
Mishap analysis doesn't waste time asking "what was he thinking?" either, but rather asks "what did he do?" We can agonize all day long about why Joe didn't ditch his weights when ditching his weights would have saved him, but it doesn't really matter. The action that will prevent a repeat of Joe's mishap is "ditch weights."
--
What we're trying to do in the A&I forum is to provide a forum for a Safety type analysis of mishaps; to identify actions that lead to mishaps and actions that can prevent mishaps. The mishap analysis mindset is difficult for those who lack formal training in it, as our natural tendencies are to find out who or what to blame and seek justice.
Just remember that justice isn't going to prevent future mishaps. It is changes in behavior that prevent the recurrence of mishaps.
Rick
 
Rick, a fair comment, but I'm not sure whether pointing to the sticky was meant as positive or critical.

Because the majority of diving takes place outside a formal regulatory framework, 'acceptable' risk is in the eyes of those involved rather than of a judge. You could also argue that a lawsuit isn't about right or wrong; it is about who has the stronger argument or evidence.

As the article highlights, judgement doesn't have to be a formal outcome such as litigation or the courts; it can also take the form of social judgement via social media. You only have to read the social media posts after an adverse event to see that blaming happens. As you highlight, it is natural to look for someone to blame. Society in general reaches for external attribution of error rather than looking internally and examining what 'I' did that led to the incident.

Whilst it can be argued that all mishaps are preventable by not partaking in the activity, current research shows that not all incidents are preventable once the decision to undertake the activity has been made. Many incidents are not linear in nature; both safety and materialising risks are emergent, which means they depend on the environment in which the events take place, and the issue cannot be reconstituted from its parts. The good news is that humans are pretty resilient and can spot some issues as they materialise, but only because they have had similar experiences in the past. If you don't have the experience, you are unlikely to spot the emergent events.

The next few articles will go through human error, risky behaviour and violations, what they mean and how to deal with them.

Finally, justice can be:

- Retributive, where society looks for someone to pay for the damage caused. This doesn't necessarily improve the situation; rather, it drives sub-optimal behaviours underground (fear of getting caught, rather than wanting to do the right thing).
- Restorative, where those involved tell their stories about why it made sense to do what they did at the time, given the pressures they faced, the goals they had to achieve, and the skills and knowledge they had. This is more likely to surface latent issues that can be resolved, and to identify organisational or supervisory issues rather than 'sharp end' issues of those involved.

This short series from Sidney Dekker is really worth watching if you are interested in Just Culture.
 