If the underlying theory of Buhlmann is even close to correct:
- Gradient Factor is the equivalent of "Normalized Tissue Supersaturation".
- Higher tissue supersaturation is more dangerous than lower tissue supersaturation.
- More time spent at high tissue supersaturation is more dangerous than less time.
These are exactly the assumptions that go into the common risk models, expressed as a formula of the form P(DCS) = 1 - exp(-integral(sum(r_i) dt)), where r_i is the instantaneous risk function for a given compartment, very close to proportional to the normalized supersaturation (a numerical sketch follows the references below).
See for example:
Weathersby PK, Homer LD, Flynn ET. On the likelihood of decompression sickness. J Appl Physiol Respir Environ Exerc Physiol. 1984 Sep;57(3):815-25. doi: 10.1152/jappl.1984.57.3.815
Thalmann ED, Parker EC, Survanshi SS, Weathersby PK. Improved probabilistic decompression model risk predictions using linear-exponential kinetics. Undersea Hyperb Med. 1997 Winter;24(4):255-74. PMID: 9444058.
Parker EC, Survanshi SS, Massell PB, Weathersby PK. Probabilistic models of the role of oxygen in human decompression sickness. J Appl Physiol. 1998;84(3):1096-1102. doi: 10.1152/jappl.1998.84.3.1096
Howle LE, Weber PW, Nichols JM. Bayesian approach to decompression sickness model parameter estimation. Comput Biol Med. 2017 Mar 1;82:3-11. doi: 10.1016/j.compbiomed.2017.01.006
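For concreteness, here is a minimal numerical sketch of that survival-model form in Python. The per-compartment hazard gains and the toy supersaturation history are made-up illustrations, not the fitted parameters from any of the papers above:

```python
import numpy as np

def p_dcs(supersat, dt, gains):
    """Survival-model form: P(DCS) = 1 - exp(-integral of the summed hazards).

    supersat: (n_steps, n_compartments) array of normalized tissue
              supersaturation over the dive (negative values carry no risk).
    dt:       time step in minutes.
    gains:    per-compartment hazard scale factors (illustrative values only;
              the real NMRI98/BVM parameters come from maximum-likelihood fits).
    """
    hazard = np.clip(supersat, 0.0, None) * gains  # r_i ~ normalized supersaturation
    total_hazard = hazard.sum() * dt               # integral(sum(r_i)) dt
    return 1.0 - np.exp(-total_hazard)

# Toy example: 3 compartments held at mild supersaturation for 30 minutes.
ss = np.full((30, 3), 0.6)
print(p_dcs(ss, dt=1.0, gains=np.array([1e-4, 5e-5, 2e-5])))  # ~0.3%
```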
Everything else in this theory operates on log curves, so it's just as likely that this one's a log curve too. That would mean the first halving of the M-value results in, what, a 4% reduction of risk? I'm sure it's worth it, better safer than safe, right?
Do you have a cite to back any of your dearly held beliefs?
I'm pretty sure you have the logarithm on the wrong side of the equation. See the formula above, or one of the references, for a more exact specification. A reduction in the integrated time at a given gradient results in an exponential reduction in risk in these models.
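A quick numerical check of that claim under the survival-model form above (the H values here are arbitrary, chosen only to sit in a plausibly small range):

```python
import numpy as np

# With P(DCS) = 1 - exp(-H), shrinking the integrated hazard H shrinks the
# risk through the exponential survival term. Toy values of H, not fitted ones:
for H in (0.04, 0.02, 0.01):
    print(f"H = {H:.2f} -> P(DCS) = {1 - np.exp(-H):.4%}")
```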
What is the probability of DCS at a GF high of 100, 85, 75, and 50?
There is some software and raw code floating around to implement the above probabilistic risk estimates. The parameter estimation is a bit tricky, as noted in the papers above. You can calculate the profile for a given GF and then estimate the risk for that profile with one of the algorithms, e.g. NMRI98 or BVM.
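In sketch form, that workflow might look like the following. Everything here is hypothetical: `plan_profile()` and `nmri98_risk()` are stand-ins for whatever GF planner and probabilistic model implementation you can find, not real library calls; the stubs exist only to make the sketch executable:

```python
def plan_profile(depth_m, bottom_min, gf_low, gf_high):
    # A real planner would return a full depth/time schedule shaped by the GFs.
    return {"depth_m": depth_m, "bottom_min": bottom_min, "gf": (gf_low, gf_high)}

def nmri98_risk(profile):
    # A real model would integrate the per-compartment hazard over the profile.
    return float("nan")

for gf_high in (100, 85, 75, 50):
    profile = plan_profile(depth_m=30, bottom_min=25,
                           gf_low=gf_high, gf_high=gf_high)
    print(gf_high, nmri98_risk(profile))
```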
In the case of US Navy limits, this means that you incur a residual risk of DCS on the order of 1 case every 1,000 dives.
I think diving at the limits of the Navy tables is a bit higher risk than that (IIRC there is a section in the manual that requires a chamber to be available). In Gerth WA, Doolette DJ. "VVal-18 and VVal-18M Thalmann Algorithm Air Decompression Tables and Procedures" (2007), the section "Comparative Analyses of Estimated DCS Risks and Total Stop Times of Tabulated Schedules" seems to indicate the schedules are not iso-risk but on the order of low single-digit percent.
I provided one reasonable answer: a GF = 50 halves the risk.
Do you have a better GF=>risk relationship?
I see you found the non-linearity, but of course the exponential expands linearly to first order, as do most functions...
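Which is easy to see numerically: for the hazard magnitudes behind sub-percent risks, the first-order term dominates (toy H values again):

```python
import numpy as np

# 1 - exp(-H) = H - H^2/2 + ...  For small total hazards, the exact value
# and its first-order (linear) approximation barely differ:
for H in (0.005, 0.01, 0.02):
    exact = 1 - np.exp(-H)
    print(f"H={H}: exact={exact:.5f}  linear={H:.5f}  "
          f"rel. diff={(H - exact) / exact:.2%}")
```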
Digging into the sources a bit, it looks like they regress across a decent number of dives under a reasonably wide range of conditions and produce statistically significant DCS probability estimates. Given that, you probably can infer something about the risk by calculating a given profile and then propagating it through the risk models.