Originally posted on the Human Factors Academy Blog.
“People make errors, which lead to accidents. Accidents lead to deaths. The standard solution is to blame the people involved. If we find out who made the errors and punish them, we solve the problem, right? Wrong. The problem is seldom the fault of an individual; it is the fault of the system. Change the people without changing the system and the problems will continue.”
Don Norman, author of The Design of Everyday Things
When looking at failures, we need to understand why it made sense for those involved to behave or operate the way they did (see the earlier post on local rationality). Human Factors and Ergonomics (HF/E) is a recognised science that deals with this, and it really came to the fore around World War II with two developments. The first was the use of checklists to improve aviation safety following the 1935 crash of the prototype B-17; the second was a shift of attention to the ‘system’, because technologies were advancing faster than humans could adapt and errors were still happening.
In 1943 the U.S. Army Air Forces called in psychologist Alphonse Chapanis to investigate repeated instances of pilots making a dangerous and seemingly inexplicable error: pilots of certain models of aircraft would touch down safely and then mistakenly retract the landing gear. The massive aircraft would scrape along the runway, showering sparks and flames. Despite numerous recommendations for more training and closer adherence to checklists, the problem continued. Chapanis interviewed pilots, but he also carefully studied the cockpits. He noticed that on the B-17 the two levers controlling the landing gear and the flaps were identical and placed next to each other.
Normally a pilot would lower the landing gear and then raise the wing flaps, which act as airbrakes and push the plane down onto the wheels. But in the chaos of wartime, a pilot could easily grab the wrong lever and retract the landing gear when he meant to raise the flaps. Chapanis’s solution was to attach a small rubber wheel to the landing-gear lever and a flap-shaped wedge to the flap lever. Pilots could immediately feel which lever was which, and the problem went away. In this case it was clear that the problem wasn’t the pilot, but the design of the technology that surrounded him.
Considering the above, and diving in particular, the following should be considered true:
- The best people can make the worst mistake
- Systems will never be perfect
- Humans will never be perfect
A fear of reporting, whether constructively on social media or through official channels when required, is indicative of the lack of a Just Culture.
Just Culture
So what is a Just Culture? It is one that is open, fair and focused on learning; combined with the design of safe systems and the management of behavioural choices, it creates an effective safety culture. The majority of research into Just Culture has been conducted in formal, established environments, so the frameworks are normally about sanctions framed around legal or HR requirements and rules. Most diving is not like this, but learning across the community is still essential: there is no way divers can be taught everything on a training course, and some of the lessons that would otherwise be learned in the real world could end with someone dead.
One of the myths of a Just Culture is that it is blame-free. This is not the case. If reckless behaviour has been undertaken where there is a duty of care, it needs to be reported in the interests of safety. Outside the duty-of-care construct, risk is managed at an individual level; but because risk perception and acceptance are individually referenced, the feedback system is different. Not reporting your error, and thereby preventing the system from learning, is the greatest problem of all. Some actions do warrant disciplinary or enforcement action (when operating in an environment with a duty of care, be that voluntary or commercial), or some other means of correcting what has happened so it doesn’t happen again in other environments.
The key question is: where do you draw the line for disciplinary action when there is a duty of care, or for negative criticism when there is not? To answer that, we all need to understand the differences between human error, risky behaviour and recklessness.
The big three to consider are:
- Human error: an inadvertent action; inadvertently doing other than what should have been done; a slip, lapse or mistake.
- Risky behaviour: choices that increase risk, where the risk is not recognised or is mistakenly believed to be justified; this includes violations and negligence.
- Reckless behaviour: a behavioural choice; an intentional act made with conscious disregard of a substantial and unjustifiable risk.
The next blog will look at these subjects in more detail, along with ways we can address the problems we face: learning from incidents, accidents and failure, and creating an environment in which divers can talk about failure. If reporting isn’t happening at a commercial level, there are likely bigger issues to resolve than the immediate fallout from the current event, such as why real events like DCS are not being reported.
Until then, consider that whilst we don’t have a formal disciplinary framework for the majority of diving, we do have social judgment, and you are part of that. If you see something adverse happening, hard as it may be, don’t jump to judging how someone could make such a silly mistake; rather, consider how they determined that it was a ‘safe’ decision. That decision will have been based on (in)experience, motivations, previous outcomes from similar situations and training, all of which are likely to be different from yours.
Learn from your mistakes; better still, learn from others’.
Footnote:
Human Factors Skills in Diving classes teach us, in a safe space, to learn from others and to improve our own safety methods and attitudes. They provide food for thought on how to create a better, easier learning environment for our own students, and they allow us to be real human beings. There is, intentionally, plenty of failure to learn from on the course.
More information on Human Factors Skills in Diving classes can be found at www.humanfactors.academy
Upcoming classroom-based course dates are here: Training Dates
Online micro-class (9 modules of approximately 15 minutes each) details are here: Human Factors Skills in Diving Online Mini-Course