Introduction
Humans are subject to a variety of cognitive biases and heuristics (mental shortcuts). These directly influence a diver's decision-making, sometimes resulting in incorrect judgements under certain circumstances. In most settings this is relatively harmless; however, for those operating in high-risk environments the consequences of an incorrect decision can be critical or even fatal, especially when there is only a short window between detecting an error and being able to recover from it. The bias itself alters the diver's perception of reality, changing their understanding of the situation and filtering the true nature of events as they unfold. These cognitive issues are further compounded by the physiological factors of diving, such as narcosis, reduced colour perception and visibility, and the changes in sound transmission underwater.
The effects on human perception
Human perception is a “conscious sensory experience” that combines our senses with the brain's ability to filter and interpret those sensory inputs. Research has identified a number of common ways in which this perception is modified. While these biases act as filters that can hinder our ability to make accurate decisions, they are also essential for coping with the massive amount of information we have to process in short periods of time. This blog covers that reduction process in more detail. The problem we face in real time in high-risk scenarios is that we are often unaware of this filtering and reality-modification process.
Types of cognitive bias
There are many types of cognitive bias that can influence divers’ safety because they impact risk perception and acceptance. The following are some examples of biases that can be particularly dangerous to divers:
Ambiguity effect
An aspect of decision theory whereby a person is more likely to select the option with an intuitively clear risk over one whose risk seems less certain. This can lead someone to choose a riskier option simply because its risk is better known. For example, a CCR diver stays on the loop when there is a fault in the rebreather rather than bailing out and making an ascent on open circuit.
Anchoring bias
A bias where people make decisions based on a data point they have been given. For example, if a baseline gas requirement or a depth for the dive is provided, that number will be used to determine requirements regardless of whether operational needs actually call for much more, or much less. A common anchor is 'surface with 50 bar/500 psi', yet there is often no understanding of what this number means in terms of cylinder size, depth or breathing rate; the sketch below shows how those factors change what a reserve actually buys you.
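As a rough illustration of why a fixed reserve figure means little without context, here is a minimal Python sketch. The cylinder sizes, SAC rate and depths are assumed example values for illustration only, not planning recommendations.

```python
# Minimal sketch: what a "surface with 50 bar" reserve actually means.
# All figures below (cylinder sizes, SAC rate, depths) are illustrative
# assumptions, not planning recommendations.

RESERVE_BAR = 50          # the anchored reserve pressure
SAC_L_PER_MIN = 20.0      # assumed surface air consumption, litres/min

def minutes_of_reserve(cylinder_litres: float, depth_m: float) -> float:
    """Estimate how many minutes of breathing the reserve provides at depth."""
    reserve_litres = RESERVE_BAR * cylinder_litres          # surface-equivalent gas
    ambient_pressure_ata = depth_m / 10.0 + 1.0             # approximate absolute pressure
    consumption_at_depth = SAC_L_PER_MIN * ambient_pressure_ata
    return reserve_litres / consumption_at_depth

for cylinder in (10.0, 12.0, 15.0):        # litres
    for depth in (10.0, 30.0):             # metres
        print(f"{cylinder:>4.0f} L cylinder at {depth:>2.0f} m: "
              f"{minutes_of_reserve(cylinder, depth):4.1f} min of reserve")
```

The same 50 bar gives roughly 15 minutes on a 12 L cylinder at 10 m but only about 7.5 minutes at 30 m, which is why the anchored number alone tells you very little.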
Attentional bias
Humans pay more attention to things that have an emotional aspect to them. In diving, this can lead a person to make a decision based on a perceived problem stemming from a past experience. For example, a diver who has had a DCI event, or whose close friend or family member has, might ignore the risk of running low on gas in an attempt to avoid DCI. An incident was recounted to me about a diver who ran out of deco gas because they were afraid of getting bent and didn't understand how the 6m stop was managed on their SUUNTO, despite having been close to 6m for a long time.
Attentional tunnelling
This has been defined as “the allocation of attention to a particular channel of information, diagnostic hypothesis, or task goal, for a duration that is longer than optimal, given the expected cost of neglecting events on other channels, failing to consider other hypotheses, or failing to perform other tasks”. This can be explained more simply using the ‘7 +/- 2 lightbulbs’ model of mental capacity: if a number of those lightbulbs are taken up with basic tasks, the capacity to monitor other tasks is limited, despite the risks of not completing those 'apparently' secondary tasks. An example would be shooting UW video and not monitoring pO2, as in the case of Wes Skiles. (The incident was much more complex than I have just explained, but attentional tunnelling was a major contributory factor.)
Automaticity
While not a bias, this refers to the fact that humans who perform tasks repeatedly will eventually learn to perform them automatically - so-called muscle memory. While generally a positive attribute, this can lead to a person automatically performing a function (such as a checklist item) without actually being cognisant of the task itself. Expectation bias can lead them to assume that the item is correctly configured even if it is not.
Availability heuristic
This describes how people over-estimate the likelihood of an event based on the emotional impact it had, or on how much personal experience they have had with that type of event. This can lead to incorrect assessments of risk, with some events being attributed more risk than they warrant and others not enough. An example might be how much focus is placed on DCI (a fairly rare event) compared to running low on, or out of, gas, which is much more common.
Availability cascade
This is a process whereby something repeated over and over comes to be accepted as fact. An example is the misconception that diving nitrox extends your bottom time AND makes the same dive safer. In reality, for the same level of DCI risk the minimum decompression time can be extended, or the risk of DCI can be reduced if the minimum decompression time for air at the same depth is used, but not both on the same dive. For example, roughly the same decompression requirement exists for 32% nitrox at 30m for 30mins as for air at 30m for 20mins; you cannot be safer and have the longer bottom time at the same time. The sketch below shows where this trade-off comes from.
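As a hedged illustration of that trade-off, this short Python sketch computes the equivalent air depth (EAD) for 32% nitrox using the commonly quoted metric formula; the figures are illustrative only and are no substitute for tables or a dive computer.

```python
# Minimal sketch: equivalent air depth (EAD) for nitrox, metric version.
# Illustrative only; real dive planning should use tables or a computer.

def equivalent_air_depth(depth_m: float, o2_fraction: float) -> float:
    """EAD in metres: the air depth with the same nitrogen partial pressure."""
    n2_fraction = 1.0 - o2_fraction
    return (depth_m + 10.0) * (n2_fraction / 0.79) - 10.0

ead = equivalent_air_depth(30.0, 0.32)
print(f"EAN32 at 30 m has an EAD of about {ead:.1f} m")
# The nitrogen load at 30 m on EAN32 is similar to air at ~24 m, which is
# where the extra no-stop time (or the reduced DCI risk) comes from, but
# you have to choose one benefit or the other on a given dive.
```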