Tale of 2 computers on 2 dives

I only used the Luna on those 2 dives and 1 or 2 other dives of the series, not all of them, which may be the difference.
 
I only used the Luna on those 2 dives and 1 or 2 other dives of the series, not all of them, which may be the difference.
Yes indeed; that too would make a difference.

Alberto (aka eDiver)
 
I only used the Luna on those 2 dives and 1 or 2 other dives of the series, not all of them, which may be the difference.
So for the dives in question, the two computers weren't starting out with the same residual nitrogen loading.

If that's the case, very little can be learned from the "experiment."
 
Dives 5 & 6 were done on the same day, starting with dive 5 at 09:36 am.

Dives 1, 2, 3 & 4 were done the previous day, and dive 4 ended at 15:53.

There were 17 hours and 43 minutes between the end of dive 4 and the beginning of dive 5 the next day.

SI= 17:43
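
Just to double-check that arithmetic, here's a quick Python sketch using the clock times quoted above (the dates are arbitrary placeholders; only the gap matters):

from datetime import datetime

# End of dive 4 and start of dive 5, using the clock times quoted above.
end_dive_4 = datetime(2023, 6, 1, 15, 53)    # placeholder date
start_dive_5 = datetime(2023, 6, 2, 9, 36)   # the next morning

surface_interval = start_dive_5 - end_dive_4
print(surface_interval)   # 17:43:00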
 
The Suunto (and RGBM or other "dual-phase" algorithms in general) will dictate increased conservatism for multi-day diving, and also for dives deeper than previous ones. There are solid theoretical reasons for this, having to do with the buildup of dissolved nitrogen and of seed bubbles that do not rapidly dissolve, and the desire to keep those seeds below the size at which they will grow. That could explain why the Suunto went into deco first (though with a short, shallow stop) and kept adding time during the slow ascent. It sees you as adding DCS risk even though you are pausing at half your max depth on the ascent, and it may be correct. In Erik Baker's immortal words, all decompression algorithms are attempts to "draw a bright, clear line through a fuzzy grey area". Liability being what it is, there is no provision for a "slightly" missed stop. Suunto is very conservative in their settings, but there is more going on here than just that.
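
For anyone who wants to see the dissolved-gas part of that in code, here's a minimal Haldane-style sketch of how a single compartment's nitrogen tension moves exponentially toward ambient. This is only the dissolved-gas piece (Suunto's RGBM also tracks bubble seeds), and all the numbers below are illustrative assumptions, not Suunto's values:

import math

def haldane_update(p_tissue, p_ambient_n2, half_time_min, minutes):
    # Constant-depth Haldane segment: the compartment's N2 tension
    # approaches the ambient N2 partial pressure exponentially.
    k = math.log(2) / half_time_min
    return p_ambient_n2 + (p_tissue - p_ambient_n2) * math.exp(-k * minutes)

# Illustrative: 40 minutes at ~20 m (ambient N2 about 0.79 * 3 = 2.37 bar),
# starting from a surface-saturated tension of 0.79 bar.
for half_time in (40, 480):
    p = haldane_update(0.79, 2.37, half_time, 40)
    print(f"{half_time:3d} min compartment after the segment: {p:.2f} bar N2")

The slow compartments barely move on a single dive, which is exactly why they only start to matter once dives stack up over multiple days.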

Bubbletrouble is correct about the manual and ascent rates/times. The ascent-time calculation is based on a 33 fpm ascent rate. The manual is just saying that if you go up slower than 33 fpm (or don't go up at all; they point that out as well), the ascent will take longer, not that additional penalties will apply. RGBM does not countenance ascent rates faster than 33 fpm, at least at shallower depths, and I'm sure the computer would penalize you for ascending faster than that for any extended stretch. While a case can be made for faster ascents from deep dives, I don't think anyone recommends 60 fpm as a general rule any more; we now understand that it's too fast.
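
To illustrate the 33 fpm point: the displayed time-to-surface is basically depth divided by that fixed rate, plus any stops. This little sketch ignores the changing deco obligation a real computer would track:

ASCENT_RATE_FPM = 33   # rate the estimate assumes

def estimated_ascent_minutes(depth_ft, stop_minutes=0):
    # Naive time-to-surface: direct ascent at 33 fpm plus any required stops.
    return depth_ft / ASCENT_RATE_FPM + stop_minutes

print(round(estimated_ascent_minutes(100), 1))                    # ~3.0 min from 100 ft
print(round(estimated_ascent_minutes(100, stop_minutes=3), 1))    # ~6.0 min with a 3-min stop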

Ron
 
Does the algorithm take air consumption into account when the transmitter is active? I was told yes by my dive shop, but I suspect not.
 
....There were 17 hours and 43 minutes between the end of dive 4 and the beginning of dive 5 the next day.
SI= 17:43
Yes, but apparently not long enough to completely eliminate the nitrogen accumulated in the "slow" compartments of the mathematical model.
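
To put rough numbers on that: with a simple half-time model, the fraction of the excess gas still dissolved after a surface interval is 0.5 raised to (interval / half-time). The half-times below are just illustrative, not any vendor's actual set:

# Fraction of excess nitrogen remaining after the 17:43 surface interval,
# for a few illustrative compartment half-times (back-of-the-envelope only).
surface_interval_min = 17 * 60 + 43

for half_time in (60, 240, 480):
    remaining = 0.5 ** (surface_interval_min / half_time)
    print(f"{half_time:3d} min half-time: {remaining:.1%} of the excess still on board")

So the fast and medium compartments are effectively clear by morning, but the slowest ones can still carry a meaningful fraction of the previous day's load.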
 
Does the algorithm take air consumption into account when the transmitter is active? I was told yes by my dive shop, but I suspect not.
The Galileo Sol/Luna can be set to factor workload, estimated from minute ventilation, into the algorithm's calculations. You can even adjust the sensitivity of this effect.
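
I don't know how Scubapro implements that internally, but conceptually you can picture it as something like the sketch below: compare the measured breathing rate to a stored reference, and when the diver is working harder, assume better tissue perfusion and load gas faster. Every name and number here is a made-up illustration, not the published algorithm:

def workload_factor(measured_rmv, reference_rmv=15.0, sensitivity=1.0):
    # Hypothetical: breathing above the stored reference pushes the factor over 1;
    # 'sensitivity' mimics the user-adjustable setting mentioned above.
    ratio = measured_rmv / reference_rmv
    return 1.0 + sensitivity * max(0.0, ratio - 1.0)

def effective_half_time(base_half_time_min, factor):
    # Higher workload -> assume faster perfusion -> effectively shorter half-time,
    # i.e. faster on-gassing and a more conservative schedule at depth.
    return base_half_time_min / factor

print(round(effective_half_time(40, workload_factor(measured_rmv=25)), 1))   # 24.0 min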
 
The Galileo Sol/Luna can be set to factor workload, estimated from minute ventilation, into the algorithm's calculations. You can even adjust the sensitivity of this effect.
I've always been a bit skeptical of these implementations, though. Breathing rates vary for many reasons other than workload, like the individual's size. Unless you have a typical reference for the individual stored in the computer, extrapolating a workload is pretty iffy. Then there's the question of what it actually means for their physiological status. The same could be said for water temperature: you can be warm in cold water and cold in warm water, depending on insulation. Algorithms are yardsticks, not micrometers; you can only expect them to measure to a certain level of accuracy.
 
