Abdullah,
Hindsight is a great thing because you can look back at the things you wish you had done differently, but at the time they might not have been as obvious as they should have been.
Did you read the blog article which was also linked? Did you watch the video? Human fallibility is a real problem, especially when it comes to monitoring life support equipment.
There is also a well-documented phenomenon called 'Normalisation of Deviance', whereby an operator or organisation drifts from the norm in small increments. Each time a deviation occurs and nothing goes wrong, the poor decision is validated, and the next deviation starts from that new norm. When the accident finally happens, you only see the distance from the original baseline to the accident, not the small incremental steps in between. Look up how the Challenger disaster occurred.
Also something to bear in mind: at the time, getting hold of reliable cells for AP units wasn't easy, and sometimes 'better the devil you know', reinforced by your natural biases, wins out over sourcing cells without a strong reliability record. When you are earning money from the dive, there is added pressure to undertake the task, even when that runs contrary to 'normal' operations.
I recently posted this blog elsewhere and got the following comment (paraphrased and cleaned up from the OP).
The most telling thing, to me anyway, is that the coroner would not confirm, due to a lack of actual evidence, that the diver opted not to calibrate the unit before the dive; it was suggested that he knew he had a high chance of it not calibrating, with the net result of cancelling what for him was an instructing/earning dive. I actually considered doing just this a couple of years ago when having issues with AP14 cells! I had a set (under a year old) working at home, but I was not confident they would calibrate again the following day, just before the dive.
I remember making a conscious decision not to calibrate the unit as I was sure the cells would crap out. It was a dive I had wanted to do for years, with cost, time, etc. all factored in. I actually walked away from the unit to consider the plan, and only decided to calibrate when I asked around about how many spare cells were on the boat: I had run out! In the end I did calibrate, and the cells did work! But the point is, I considered not doing it.
The thing to take away from this is that we ALL make mistakes, and that detailed reports are essential if we are to improve performance and safety. We also need to be aware that in a number of situations the only person who really knows why they made those decisions is no longer with us. Judging negatively without knowing the whole background or context does not help the community come forward when they have made honest mistakes (or violations) so that others can learn from them.
Regards
Gareth
Cognitas Incident Research and Management
DISMS | Home