"I have it on good authority that physicists throw away almost every data point coming from the sensors of instruments like particle colliders."

LOL. And I throw away most of the "facts" I read on ScubaBoard. What's your point?
The processing required to calculate algorithms like Buhlmann or the “folded” RGBM (those calculations actually being Haldanian) is minimal. Processor speed is unlikely to be a limiting factor. When you get to fully iterative bubble models, like “full” RGBM or VPM, then the processing required can be more significant, at least for longer profiles with a lot of deco time. But I’d be surprised if there are computers now that don’t run algorithm calculations at essentially the same rate at which they sample data, i.e. every second, or two, or three.
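It's easy to see why the per-sample cost is small: a Haldanian update is one exponential per compartment. Below is a minimal sketch, assuming a Buhlmann-style exponential loading step; the half-times and function names are illustrative, not any particular computer's firmware.

```python
# Minimal sketch of a Haldanian/Buhlmann-style per-sample tissue update:
# one exponential per compartment, so the per-second cost is tiny.
# Half-times and names here are illustrative, not any vendor's firmware.
import math

HALF_TIMES_MIN = [4.0, 8.0, 12.5, 18.5, 27.0, 38.3, 54.3, 77.0]  # example subset

def update_tissues(tissue_pn2, inspired_pn2, dt_seconds):
    """Advance each compartment's inert-gas pressure by dt_seconds."""
    updated = []
    for p_tissue, half_time in zip(tissue_pn2, HALF_TIMES_MIN):
        k = math.log(2) / (half_time * 60.0)   # rate constant in 1/s
        # exponential approach toward the inspired inert-gas pressure
        p_new = inspired_pn2 + (p_tissue - inspired_pn2) * math.exp(-k * dt_seconds)
        updated.append(p_new)
    return updated

# e.g. one 2-second sample at ~30 m on air (inspired pN2 roughly 0.79 * 4 bar,
# ignoring water vapour):
tissues = [0.79] * len(HALF_TIMES_MIN)   # start equilibrated with surface air
tissues = update_tissues(tissues, 0.79 * 4.0, dt_seconds=2.0)
```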
Algorithms aren’t monolithic in their implementation. There are choices about how to run different aspects, and when and how often to display information. You definitely want to use the processor efficiently to save battery power; that is in fact a huge issue. I think it’s safe to assume that all micros used in dive computers have multiple low-power modes they can be put into. But dive computers are real-time systems, and you really need to keep the tissues updated in real time along with depth. However, that’s only part of the algorithm. This does not mean you are recalculating schedules or no-stop times that frequently. I don’t think I was being clear about that.
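A rough sketch of that split, with hypothetical function names: tissues are advanced every depth sample, the heavier schedule/no-stop recalculation runs less often, and the processor sleeps in between (a real MCU would drop into a low-power mode rather than calling sleep()).

```python
# Rough sketch (not any vendor's firmware): tissue loading is advanced every
# sample, but the heavier no-stop / deco-schedule recalculation runs only
# every Nth sample. time.sleep() stands in for a microcontroller low-power
# wait (IDLE/HLT/WFI). The function arguments are hypothetical.
import time

SAMPLE_PERIOD_S = 2           # depth sample / tissue update interval
RECALC_EVERY_N_SAMPLES = 10   # recompute NDL / stops every 20 s

def dive_loop(read_depth, update_tissues, recompute_schedule):
    sample_count = 0
    while True:
        depth = read_depth()                    # cheap: pressure sensor read
        update_tissues(depth, SAMPLE_PERIOD_S)  # cheap: one exponential per TC
        sample_count += 1
        if sample_count % RECALC_EVERY_N_SAMPLES == 0:
            recompute_schedule()                # heavier: NDL / stop search
        time.sleep(SAMPLE_PERIOD_S)             # real MCU would idle/WFI here
```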
...<snip>
If you take multiple samples in one update, you can tell whether you are going up or down within the update period. You can also average 20 samples and use that for the update every 20 seconds. Doing that would dampen out the otherwise larger variations from moving your arm and falling out of a stop depth band. A 30 ±1 ft stop with a tide or wave surge of 2 ft would average out to a more constant depth and not generate an incomplete stop.
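A minimal sketch of that averaging idea, using the 20-sample window from the post above; the names and structure are mine, not any computer's actual firmware.

```python
# Minimal sketch of the 20-sample averaging idea above; names are mine.
from collections import deque
import random

WINDOW = 20  # one averaged update per 20 one-second samples

def averaged_depths(depth_samples_ft):
    """Yield one averaged depth per WINDOW raw samples."""
    window = deque(maxlen=WINDOW)
    for sample in depth_samples_ft:
        window.append(sample)
        if len(window) == WINDOW:
            yield sum(window) / WINDOW
            window.clear()

# Example: a 30 ft stop with +/-1 ft of arm movement and ~2 ft of surge
# still averages out close to 30 ft over each 20-second window.
samples = [30 + random.uniform(-1, 1) + random.uniform(-2, 2) for _ in range(60)]
print(list(averaged_depths(samples)))   # three values, all near 30 ft
```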
If your microprocessor has IDLE/HLT/WFI you can save a lot of battery power not recalculating needlessly. If it doesn't then it's moot, and I've no idea if any processors used in real dive computers do.
Underlying gas dynamics is ass-u-me-d to be exponential log(2) loading: during the 1st half-time the tissue gets to 50% of ambient pressure, the 2nd half-time adds another 25%, and so on out to 6 half-times, where the tissue is "practically saturated". [Attachment: tissue loading curve plotted over successive half-times]
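A quick arithmetic check of that series, using the standard saturation fraction of 1 − 2⁻ⁿ after n half-times:

```python
# Quick check of the half-time series above: saturation fraction after n
# half-times is 1 - 2**(-n), approaching "practically saturated" by n = 6.
for n in range(1, 7):
    print(f"{n} half-times -> {(1 - 2**-n) * 100:.2f}% of ambient")
# 50.00%, 75.00%, 87.50%, 93.75%, 96.88%, 98.44%
```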
The part most affected by short time intervals is in the bottom-left corner.
The fastest tissue compartment is most sensitive to short time quanta. Take the Cressi/ZH-L12 fastest TC with a half-time of 2.5 minutes, or 150 seconds. Recalculating every second means that in the bottom-left corner you're tracking the loading down to roughly 1/150th of the 50% step, and progressively less as the tissue takes on more gas. That only really makes sense if your CPU consumes as much power in an idle loop as it does calculating, because the amount of gas taken on during that interval makes no appreciable difference to your deco/NDL calculation.
Every 20 seconds would give you around 1/8th of 50% in the bottom left corner, which may be worth tracking, even up to T3.
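For comparison, here is the exact exponential per-step uptake for a 150-second half-time at a few update intervals; it lands close to the back-of-envelope linear figures above.

```python
# Exact exponential per-step uptake for a 2.5 min (150 s) half-time
# compartment, as a fraction of the remaining gradient, at a few update
# intervals. Compare with the rough linear figures above.
HALF_TIME_S = 150
for dt in (1, 2, 3, 20):
    fraction = 1 - 2 ** (-dt / HALF_TIME_S)
    print(f"dt = {dt:>2} s -> {fraction * 100:.2f}% of the remaining gradient")
# ~0.46% per 1 s step versus ~8.8% per 20 s step
```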
(I do have too many short idle time intervals on my hands this week.)