Dear fellow divers,
Please forgive this post; the last thing I want to do is start another Mitchell vs. Hemingway debate, but I have an interesting discussion point I would like to get all your learned viewpoints on. My apologies if it is covered elsewhere, in which case please redirect.
In reference to the recent discussions on the fact that helium behaves similarly to nitrogen in terms of decompression times, and that the "helium penalty" in deco algorithms works, although probably for the wrong reason, as well as articles such as this:
Deep Helium, Advanced Diver Magazine, by B.R. Wienke and T.R. O'Leary
I would like to suggest the following:
Using a heliox gas on CCR, say 10/90, but entering into my dive computer a mix that equates to an END of 30m, so that it calculates a "safe" but faster deco profile based on the depth I am diving (so something like 10/65 for a 90m dive). Assuming the helium penalty was an incorrect assumption, a high-helium-content gas is the safer gas (WoB, gas density, narcotic effect, etc.), and on a CCR not much more expensive. The assumption is therefore that using this gas, but with the existing algorithms in the computer set for an END of 30m, would provide a relatively safe decompression profile. The O2 content being accurate, of course.
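For reference, the END arithmetic behind the 10/65 example can be sketched as follows. This is just an illustrative helper (not from any dive computer), assuming the common convention that oxygen and nitrogen are equally narcotic and helium is not, with depths in metres of seawater:

```python
def end_metres(depth_m, f_he):
    """Equivalent narcotic depth in metres, treating everything
    except helium as narcotic: END = (depth + 10) * (1 - fHe) - 10."""
    return (depth_m + 10) * (1 - f_he) - 10

# Actual loop gas: heliox 10/90 at 90 m -> essentially no narcotic load
print(end_metres(90, 0.90))   # ~0 m

# Mix entered into the computer: trimix 10/65 at 90 m
print(end_metres(90, 0.65))   # 25 m, i.e. roughly the 30 m END target
```

Under the stricter convention that only nitrogen is narcotic (divide fN2 by 0.79), the 10/65 figure comes out a little shallower still, so either way it sits at or under the 30m END target.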
What would the reasons be why this should not work? In other words, besides the fact that there is no concrete data to support any of this, why should this NOT work? (Yes, a theoretical exercise only.) The only thing I can think of is perhaps IBCD if I bail out, but not if I plan those mixes well? Assuming that the inert gas calculations done for decompression are pretty much a theoretical mix to provide a safe profile for a 30m END, what would the other reasons be?
Cheers
PS: I am sure many will now grab the popcorn, but I assure you this is not an antagonistic post. Just trying to think of reasons why this would not work.