No argument there. As the OP is contending, however, a given bubble doubles in volume when depth is changed from 500 ft to 233 ft. It also doubles going from 33 ft to 0. Substitute whatever size increase you wish for "doubles", and it remains that there can be a larger depth decrease the deeper you start while still yielding the same size bubble.

"Decompression stress is defined as the amount of inert gas dissolved in various tissues throughout the body. During ascent, bubbles increase in size and are released by tissues into the veins."
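To sanity-check the doubling arithmetic (my own quick sketch, using the standard 1 atm per 33 ft of seawater; the specific numbers below are mine, not from the thread):

```python
def ambient_atm(depth_ft):
    """Absolute pressure in atm at a given seawater depth (1 atm per 33 ft)."""
    return 1.0 + depth_ft / 33.0

def doubling_depth(start_ft):
    """Depth at which a bubble from start_ft has doubled in volume.
    By Boyle's law, volume is inversely proportional to absolute pressure,
    so the bubble doubles where ambient pressure has halved."""
    half_pressure = ambient_atm(start_ft) / 2.0
    return (half_pressure - 1.0) * 33.0

print(doubling_depth(33))    # 2 atm -> 1 atm: doubles at the surface (0 ft)
print(doubling_depth(500))   # ~16.2 atm -> ~8.1 atm: doubles at ~233 ft
```

So a 33 ft decrease does it from 33 ft, while a ~267 ft decrease is needed from 500 ft, which is the point about larger depth changes deeper down.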
In other words, the rate of bubble expansion can be made identical by ascending faster from deeper. Our imperfect off-gassing mechanism is able to "keep up" with that expansion rate when shallow, so why can it not keep up with it when deeper?
I also see a distinction between ascent rate and deco stops. The latter depends on the ratio of tissue pressure to ambient pressure. Empirically, we know you have to stop before that ratio gets too big (e.g., Haldane's 2:1). I'm not aware that any of that deco research investigated the ascent RATE while the ratio was still below the threshold, but I'm happy to be corrected on that.
Back to the original hypothesis under discussion: the time taken to go from a tissue:ambient ratio of 1.0 to 2.0 (i.e., to "reach the next stop") is identical between faster/deeper and slower/shallower. I understand your point that our off-gassing mechanism has a fixed "throughput", but when the available time is the same and the additional bubble volume is the same, what causes the system performance to differ?
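Here's a toy calculation of the "same available time" claim (my own sketch and my own scaling assumption, not from the thread): if the deep diver's ascent rate is scaled up in proportion to starting absolute pressure, the time for ambient pressure to halve (i.e., for a saturated tissue's ratio to go from 1.0 to 2.0, treating tissue tension as fixed over the ascent) works out identical.

```python
def time_to_ratio_2(start_depth_ft, rate_ft_per_min):
    """Minutes of ascent until ambient pressure is half the starting ambient,
    i.e., until a tissue saturated at the start hits a 2.0 tissue:ambient
    ratio. Assumes seawater (1 atm per 33 ft) and constant ascent rate."""
    p0 = 1.0 + start_depth_ft / 33.0        # starting ambient pressure, atm
    stop_depth = (p0 / 2.0 - 1.0) * 33.0    # depth where the ratio hits 2.0
    return (start_depth_ft - stop_depth) / rate_ft_per_min

# Shallow diver: 33 ft at a base 30 ft/min. Deep diver: 500 ft, with the
# rate scaled by the ratio of starting pressures (my assumption: rate
# proportional to p0, normalized so the 33 ft case stays at 30 ft/min).
shallow = time_to_ratio_2(33, 30)
deep = time_to_ratio_2(500, 30 * (1 + 500 / 33) / 2)
print(shallow, deep)   # both come out to 1.1 min
```

The algebra behind it: the distance from depth d to the ratio-2.0 depth is (d + 33)/2 ft, which is proportional to starting pressure, so scaling the rate by the same factor cancels it and the elapsed time is constant.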