I adhere to the "Dive 104/108/112/130's like everyone else rule"
Anyone know the technical answer to why you hit a law of diminishing returns when filling tanks to higher pressures? I know there was an article a while back in one of the cave magazines (I think Larry Green wrote it) that explained why filling beyond 3600 psi didn't gain much gas. I had also been told that gas matching a tank at 3900 psi with one at 2400 psi will result in a somewhat inaccurate calculation, because the 2400 psi tank will have more gas per psi. I think the difference is small enough that it's not a huge deal (I might be wrong), but I've been curious to hear a clear explanation for a while.
I don't know all the details, but my understanding is that the ideal gas law (PV = nRT) that is the basis for our calculations begins to break down at high pressures. I believe the correction is usually written as a compressibility factor Z (PV = ZnRT), and for air Z climbs above 1 as pressure rises, so each additional psi packs in fewer molecules. Like a lot of other scientific "laws", it's a good approximation within certain ranges, but beyond that it starts to fail. It's like Newton's laws being only an approximation, yet very accurate at speeds small compared to the speed of light.
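To put some rough numbers on it, here's a quick sketch using the van der Waals equation as a simple real-gas model. The constants are approximate textbook values for nitrogen/air, the 12 L internal volume is just a hypothetical tank size, and van der Waals is only a crude model (real compressibility data for air would differ somewhat), so treat the output as an illustration of the trend, not fill-station-accurate numbers.

```python
# Sketch: why high-pressure fills give diminishing returns, using the
# van der Waals equation (P + a*n^2/V^2)(V - n*b) = nRT as a rough
# real-gas model for air. Constants are approximate; tank size is made up.

R = 0.08314          # gas constant, L*bar/(mol*K)
A = 1.37             # van der Waals 'a' for nitrogen/air, L^2*bar/mol^2 (approx.)
B = 0.0387           # van der Waals 'b', L/mol (approx.)
T = 293.0            # temperature, K (about 20 C)
V = 12.0             # internal tank volume in liters (hypothetical)
PSI_TO_BAR = 0.0689476

def ideal_moles(p_bar):
    """Moles predicted by the ideal gas law, PV = nRT."""
    return p_bar * V / (R * T)

def vdw_moles(p_bar):
    """Solve the van der Waals equation for n by bisection.

    f(n) = (P + a*n^2/V^2)(V - n*b) - nRT is positive at n = 0 and
    negative at n = V/b, and decreases through the physical root.
    """
    lo, hi = 0.0, V / B
    for _ in range(100):
        mid = (lo + hi) / 2
        f = (p_bar + A * mid**2 / V**2) * (V - mid * B) - mid * R * T
        if f > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for psi in (2400, 3600, 3900):
    p = psi * PSI_TO_BAR
    n_ideal, n_real = ideal_moles(p), vdw_moles(p)
    print(f"{psi} psi: ideal {n_ideal:6.1f} mol, "
          f"van der Waals {n_real:6.1f} mol, "
          f"{100 * n_real / psi:5.2f} mol per 100 psi")
```

Running it shows the real-gas model storing noticeably fewer moles per psi at 3900 psi than at 2400 psi, which is the "more gas per psi in the lower-pressure tank" effect mentioned above, and why psi-based gas matching between very unevenly filled tanks is slightly off.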