My Journey into UTD Ratio Deco

It would be a legitimate question to ask whether this method takes you outside the safety margin generated by your dive computer. If it accurately mimics your dive computer, then whatever scientific basis you have for your computer will also support this method.

I think a different way to express your particular concern would be: what gradient factor would I be riding on Bühlmann ZHL-16C if I did this to the most aggressive limits acceptable by UTD standards?

The most aggressive you could go on this method while still remaining within UTD agency standards would be to do three dives a day, keeping the same min-deco table and the same ascent schedule, with the surface interval at the 60-minute minimum. Even with surface intervals at the minimum, bottom times at the maximum, and ascent schedules unmodified, it is hard to exceed gradient factor 100/100, which is what you hit when you dive conventional tables to their square-profile maximum limits. So in the end the scientific basis for this would be its ability to keep you within the safety parameters defined by Bühlmann ZHL-16C. No?
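To make the "gradient factor 100/100" reference above concrete: a gradient factor simply scales how much inert-gas supersaturation above ambient pressure a Bühlmann compartment is allowed to carry, and GF 100 means accepting the full, unmodified limit. Here is a minimal Python sketch for a single nitrogen compartment, using the original ZHL-16A coefficient relationships (ZHL-16C adjusts some a-values, so treat the numbers as illustrative only, not as a dive planner):

```python
# A minimal sketch of what "riding gradient factor 100/100" means against a
# Buhlmann compartment limit. Coefficients come from the original ZHL-16A
# relationships (a = 2 * t^(-1/3), b = 1.005 - t^(-1/2)); ZHL-16C tweaks some
# a-values, so these numbers are illustrative, not a dive planner.

def buhlmann_coefficients(half_time_min: float) -> tuple[float, float]:
    """a/b coefficients for a nitrogen compartment with the given half-time."""
    a = 2.0 * half_time_min ** (-1.0 / 3.0)
    b = 1.005 - half_time_min ** (-0.5)
    return a, b

def tolerated_tissue_pressure(p_amb_bar: float, half_time_min: float, gf: float) -> float:
    """Maximum inert-gas tissue tension (bar) tolerated at ambient pressure.

    gf = 1.0 is GF 100 (the full, unmodified Buhlmann supersaturation);
    gf = 0.8 (GF 80) allows only 80% of that gradient above ambient.
    """
    a, b = buhlmann_coefficients(half_time_min)
    m_value = p_amb_bar / b + a              # unmodified Buhlmann limit
    return p_amb_bar + gf * (m_value - p_amb_bar)

# Example: the 27-minute compartment surfacing to 1 bar ambient pressure.
print(round(tolerated_tissue_pressure(1.0, 27.0, 1.0), 3))  # GF 100 limit, ~1.90 bar
print(round(tolerated_tissue_pressure(1.0, 27.0, 0.8), 3))  # GF 80 limit, ~1.72 bar
```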
No, I asked for the scientific basis for this approach, and your reply only pretends to answer that question.
 
There is a DIR forum where people can post DIR experiences without any interference from people asking pesky questions. This thread can be moved there. The OP just has to ask.
 
A bottom timer is "required equipment".
Interesting to note that the suggested one provides average depth information - are UTD divers expected to use that information, or to calculate the running average themselves?

I'm also interested in understanding how they compute the depth average. Is it simply (previous average + new depth) divided by two (a rolling average), or the actual average of the sampled depths? I ask because the table of depths in post 15 averages to 84.33, not the 90 described. Am I missing something? And what happens if you know your depth varied between the 5-minute samples (such as rising to cross a coral formation and then dropping back down)?
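For what it's worth, the two readings of "average depth" diverge quite a bit. A small Python sketch with a made-up set of 5-minute depth samples (not the table from post 15) shows the difference between the true equal-weight average and the fold-in-the-new-depth rolling average described above:

```python
# Contrast two ways of "averaging depth" from equally spaced 5-minute samples.
# The depth list is hypothetical; it is not the table from post 15.

depths_ft = [100, 95, 90, 80, 70, 60]  # assumed depth at each 5-minute mark

# True average: every equal-length sample counts the same.
true_average = sum(depths_ft) / len(depths_ft)

# Rolling average as described above: (previous average + new depth) / 2,
# which progressively over-weights the most recent samples.
rolling_average = depths_ft[0]
for depth in depths_ft[1:]:
    rolling_average = (rolling_average + depth) / 2

print(f"true average:    {true_average:.1f} ft")     # 82.5 ft
print(f"rolling average: {rolling_average:.1f} ft")  # ~69.2 ft
```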

If it is the rolling average, the adage "measure with a micrometer, mark with chalk, and cut with an axe" came to mind, but it seems more like "measure to the nearest foot, mark with spray paint, and cut with an axe".

I should note that I am not for or against any system; I simply want to understand it.
 
There is a DIR forum where people can post DIR experiences without any interference from people asking pesky questions. This thread can be moved there. The OP just has to ask.
No.

This is NOT DIR.
 
It would be a legitimate question to ask whether this method takes you outside the safety margin generated by your dive computer. If it accurately mimics your dive computer, then whatever scientific basis you have for your computer will also support this method.

I think a different way to express your particular concern would be: what gradient factor would I be riding on Bühlmann ZHL-16C if I did this to the most aggressive limits acceptable by UTD standards?

The most aggressive you could go on this method while still remaining within UTD agency standards would be to do three dives a day, keeping the same min-deco table and the same ascent schedule, with the surface interval at the 60-minute minimum. Even with surface intervals at the minimum, bottom times at the maximum, and ascent schedules unmodified, it is hard to exceed gradient factor 100/100, which is what you hit when you dive conventional tables to their square-profile maximum limits. So in the end the scientific basis for this would be its ability to keep you within the safety parameters defined by Bühlmann ZHL-16C. No?
It quickly diverges from your computer once you start doing real (non-min-deco) diving. I'm curious how your class will explain it away. The GUE answer for these situations is that ratio deco is wrong, because they just made it up...
 
RD gives you options. You can choose how precise to make your average depth calculation. The method Captain Sinbad detailed is NOT even in the book. The book details two methods, but doesn't say you have to limit yourself to only those two methods. And after getting your average, the book recommends you weight the average either deeper or shallower - depending on how conservative you think it should be. And there is nothing that says you can't compare the average you calculated with the average displayed on your bottom timer.

The point is to create a thinking diver. After you have learned the tool of RD, you can dive your computer if you want. But in a UTD class, you will practice RD so you can learn that it's not that hard, see how useful it is, and have it as an available tool.
 
Depth averaging is fine. I've done hundreds of dives using depth averaging and a bottom timer. UTD's ratio deco is not fine.

Definitely not DIR. If your name isn't on an arrow in the Woodville Karst Plain, you don't get to slap a DIR stamp on anything, IMO.
 
https://www.shearwater.com/products/perdix-ai/
