From 24 fsw to the surface there is a pressure drop of about 42%; from 15 fsw to 8,000 feet the drop is about 50%. It is, I believe, the percent change, not the absolute change, that is the critical factor.
In a saturation model (e.g. Haldane), the absolute pressure gradient controls the physics. Perhaps this would not be true in a perfusion/diffusion model (e.g. RGBM).
Also, perhaps my math is suspect, but I get a 40.3092% change in pressure from 24 fsw to the surface:
% change = 100*(1 - 1.0832501/(1.0832501 + 24/32.80839895))
Note that this takes 1 atm of surface pressure as 1.0832501 bar on a "standard" day, and 32.80839895 fsw (10 m of water) as 1 bar. Originally I used approximations (e.g. 1 + 24/33, etc.).
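For anyone who wants to check the arithmetic, here is a small Python sketch of that calculation. The constant and function names are just for illustration, and reading 32.80839895 fsw (10 m of water) as 1 bar is my interpretation of the formula above:

# Percent drop in absolute pressure when ascending from a given depth (fsw) to the surface.
SURFACE_BAR = 1.0832501      # surface pressure on a "standard" day, as above
FSW_PER_BAR = 32.80839895    # 10 m of water column read as 1 bar

def percent_drop_to_surface(depth_fsw):
    p_depth = SURFACE_BAR + depth_fsw / FSW_PER_BAR
    return 100.0 * (1.0 - SURFACE_BAR / p_depth)

print(percent_drop_to_surface(24))   # ~40.3092
print(percent_drop_to_surface(15))   # ~29.68 (15 fsw to the surface alone)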
To replicate a 40.3092% pressure gradient using an initial (instantaneous) ascent from 15 fsw to the surface would require an instantaneous further ascent to 3,701.544' MSL. To replicate the 42% pressure gradient would require a further instantaneous ascent to 4,264.481' MSL.
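The first step behind those altitude figures is to find the absolute pressure that completes the chosen percentage drop when starting from 15 fsw; converting that pressure to an altitude then uses the standard-atmosphere relation quoted below. A sketch of that first step, under the same assumptions as above:

# Absolute pressure that completes a total drop of drop_percent,
# measured from the pressure at depth_fsw (same constants as before).
SURFACE_BAR = 1.0832501
FSW_PER_BAR = 32.80839895

def target_pressure(depth_fsw, drop_percent):
    p_start = SURFACE_BAR + depth_fsw / FSW_PER_BAR
    return p_start * (1.0 - drop_percent / 100.0)

print(target_pressure(15, 40.3092))   # in "bar"; the matching altitude follows from the formula below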
For reference, from the 1976 U.S. Standard Atmosphere, the pressure in the troposphere (below 36,089') is:
p = p0*(1 + a*h/T0)^5.2561
Rearranging for h:
h = (T0/a)*[(p/p0)^(1/5.2561) - 1]
Where: p = pressure at altitude, p0 = sea-level pressure (in the same units as p), a = temperature lapse rate in the troposphere (-0.003566 degF/ft), T0 = standard temperature at sea level (518.67 degrees Rankine), and h = geometric height in ft [note: the formula is somewhat different if geopotential height is used].
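Here is a Python sketch of that pair of formulas, troposphere only, using the constants quoted above (the function names are just for illustration):

# 1976 U.S. Standard Atmosphere, troposphere only (h below 36,089 ft).
T0 = 518.67       # standard sea-level temperature, degrees Rankine
a = -0.003566     # temperature lapse rate, degF/ft
n = 5.2561        # exponent from the standard atmosphere

def pressure_ratio(h_ft):
    # p/p0 at geometric height h_ft (ft)
    return (1.0 + a * h_ft / T0) ** n

def altitude_for_ratio(p_over_p0):
    # height (ft) at which pressure has fallen to the given fraction of sea level
    return (T0 / a) * (p_over_p0 ** (1.0 / n) - 1.0)

print(pressure_ratio(8000.0))                       # ~0.743 of sea-level pressure
print(altitude_for_ratio(pressure_ratio(8000.0)))   # ~8000 (round-trip check)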
Tom