Human error is normal. Human error is part of the way we learn. It is almost impossible to remove human error from any system. Therefore, 'Human error' should not be the conclusion of an investigation. If it is, then we are not likely to improve the situation for the future. Depending on the outcome of the error or errors, the impact can be minor or it can be fatal; the problem is that we don't necessarily know the scale of the issue until after the event.
In the last blog I covered the basic concept of a Just Culture and why it is essential to have this if we are to improve safety. We need to be able to talk about the errors or violations (at-risk behaviours) that occur, and the reasons why it made sense to us at the time, if we are to improve performance and safety and reduce the likelihood of the same adverse event happening again. In this blog I am going to talk about 'human error' and the differences that exist within this overly simple classification. The next two blogs will cover 'At-risk behaviours' and 'Reckless behaviours', which sit on the right-hand side of this model.
Human Error
Human Error can be defined as “the failure of a planned action to be completed as intended, or the use of a wrong plan to achieve an aim. Errors can include problems in practice, products, procedures, and systems.” This could be as simple as picking up the wrong keys, driving past the turn-off on the way to a friend's house, picking up strawberry jam instead of raspberry, or, in the context of diving, forgetting to write the correct gas analysis on the tape, miscalculating a maximum operating depth or turn pressure, running out of gas, or losing our buddy.
We all make errors every day, generally with minimal consequences. In aviation, research has shown that pilots make in the order of 3-6 errors per hour and yet they still don't crash that often - fortunately! In diving, the errors we make also have the potential for dire consequences. Again, fortunately, many errors are picked up by the diver before it is too late. However, sometimes we get distracted, or miss the clues and cues which would identify the developing problem. Those clues and cues are easily identifiable after the event because we are working backwards, joining the dots, rather than trying to complete a dynamic jigsaw puzzle without all the pieces or knowing where it will end.
The following diagrams show this. The first is a very simple linear decision tree which leads to an adverse outcome. The decisions made are shaped by the factors in the Situational Awareness model from previous blogs. By the time we get to the adverse outcome, there are 27 possible outcomes. (It is recognised that in the real world we have multiple parallel and serial decision-making processes happening at the same time, and therefore life is far more complicated than this!)
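To make the scale of that forward branching concrete, here is a minimal sketch. It is my illustration rather than the diagram itself, and it assumes three decision points with three options at each, giving 3 x 3 x 3 = 27 possible paths:

```python
from itertools import product

# Illustrative only: three successive decision points, each with three
# plausible options shaped by the diver's situational awareness at the time.
options = ["press on as planned", "adapt the plan", "abort or adjust"]

# Every distinct sequence of choices is a different path through the tree.
paths = list(product(options, repeat=3))
print(len(paths))  # 27 possible paths after just three decision points
```

Working forwards, every one of those 27 paths looks plausible while we are on it; only one of them ends at the adverse outcome we later investigate.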
The second diagram starts from the outcome and works backwards.
Most people, when looking at an incident, look for clues and cues to back up their hypothesis of what happened: e.g. the diver ran out of gas, that's because they weren't monitoring their consumption, because they were poorly trained... but the evidence gets weaker as you move further back down the path. It is easy to blame the individual and tell them "remember to monitor your gas" - that is like saying "don't walk on the motorway because you'll get hit by a car". However, in the latter case, the risks are more easily recalled and so the warning is more likely to be adhered to. Evidence showing the number of divers who run out of gas but still survive is really poor, yet running out of gas is the most common trigger for fatalities according to DAN.
For instructors there is an increased level of pressure not to make a mistake. They are expected to be somehow ‘above’ human fallibility and not make mistakes. We need to understand that individuals, instructors and non-instructors alike, do not intend the mistake, error or undesirable outcome, even though the consequences are potentially life-threatening. Many discussions use phrases such as 'why did they make the choice' or 'they made an error', which implies that a choice was actively made. However, as we have seen in these two blogs (here and here), many of the decisions we make are subconscious and therefore there is no active choice.
I sometimes get asked how to deal with individuals who make repeated errors. First we need to understand the type of errors that are being made. Are the errors due to a lack of skill, knowledge or training, or is it because they are in an error-prone environment, such as a busy filling station with lots of pressure to mix different fills against the clock, or a dive centre which pays its instructors 'per student' so that time is the limiting factor? Has the equipment been designed to reduce error, or is it known that there are workarounds to get the job done because of poor design? Are the individuals stressed, distracted or unfocussed? All of these will lead to a greater likelihood of an error occurring. Furthermore, once someone has made an error in an environment where errors aren't tolerated (e.g. they risk losing their job, or the outcome is critical), they are likely to be more stressed. Therefore we must first understand the system aspects of the adverse event before we can determine the best way forward.