Rebreather Question


@joshk Sorry for the engineer brain turning on... that's how I look at all of it, though. The computer has to turn the cells into y = mx + b and bound it with our calibration points. It has to assume that outside of those points the line holds true, which we unfortunately know it does not.

There should be no offset b value, since 0% / 0mV is used as one calibration point. And to repeat one more time: calibrating with a single gas is already a two-point calibration, since the second calibration point is 0% / 0mV. With two gases you can make a 3-point calibration.
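To put that in code terms, here's a minimal Python sketch of the same point (names and numbers are just illustrative): with 0% / 0mV as one point, a single-gas cal is already a line through the origin, so b is forced to zero.

```python
# Single-gas calibration: a line through (0 mV, 0 ppO2) and one measured
# gas point, so the offset b is zero by construction. Numbers illustrative.

def cal_single_gas(mv_cal, ppo2_cal):
    slope = ppo2_cal / mv_cal          # ppO2 per mV; b = 0 by construction
    return lambda mv: slope * mv

# E.g. a cell reading 60 mV in pure O2 at the surface (ppO2 = 1.0):
ppo2 = cal_single_gas(60.0, 1.0)
print(ppo2(48.0))   # 0.8 -- assumes the line holds outside the cal point
```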
 

My Meg does a 2-point calibration with air and O2, so there is a +b. The Predator only does O2, and you can see the difference in displayed ppO2s because of it. The farther you get from 1.0, the larger the delta in displayed ppO2 between the Predator and the controller.
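For anyone who wants to see why that delta grows away from 1.0, here's a rough sketch with made-up mV numbers for a slightly non-linear cell. Both calibrations pass through the O2 point, so that's the only place the two displays agree.

```python
# Made-up numbers for a slightly non-linear cell, calibrated two ways.

def two_point(mv1, ppo2_1, mv2, ppo2_2):
    """Air + O2 calibration: fit ppO2 = m*mV + b through both points."""
    m = (ppo2_2 - ppo2_1) / (mv2 - mv1)
    b = ppo2_1 - m * mv1    # non-zero when the cell isn't perfectly linear
    return lambda mv: m * mv + b

def one_point(mv_o2, ppo2_o2=1.0):
    """O2-only calibration: line through the origin."""
    return lambda mv: mv * ppo2_o2 / mv_o2

# Say the cell reads 14.0 mV in air (ppO2 .21) and 60.0 mV in O2 (ppO2 1.0):
meg, predator = two_point(14.0, 0.21, 60.0, 1.0), one_point(60.0)

for mv in (14.0, 36.0, 60.0, 84.0):
    print(f"{mv:5.1f} mV  meg {meg(mv):.3f}  predator {predator(mv):.3f}")
# The two agree at the O2 point (1.0) and drift apart the farther you get from it.
```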


Your diluent ppO2 needs to be within the range of accuracy of your system, and if your target ppO2 is outside that range, then you can't run that target ppO2.

So you never run a target ppO2 that is above 1.0?
Your system is not considered accurate above 1.0 because that is outside the calibration range. That was the whole original point: if I have to choose between a 2-point calibration over .21-1.0 or over 1.0-1.6, I will choose 1.0-1.6 every time. I will accept the inaccuracy below 1.0 because the risks to me there are inconsequential compared to the risks of inaccurate readings at 1.6.
 

You don't verify above 1.0 based on what it was reading on the surface with air and oxygen?

That's my point.

Inaccuracies below 1.0 do matter; the range of accuracy has to include your diluent ppO2.


:poke:

I've been trying to stick to specifically what I disagree with. Either I've communicated that very poorly, or you truly believe the handwavy hypothetical multipoint-quadratic-calibration-under-pressure BS really makes your argument and I'm not willing to spend the time to understand it. My loss, I guess.
 
@joshk I do verify, but despite verification it is still considered inaccurate, because it is outside the bounds of the calibration points. It is a technicality, but it is still there, and unfortunately, because of the way galvanic cells deteriorate, the linearity is often not the same at the top of the range as at the bottom. That was my only point on that.
 
What standard says the calibration points for an instrument must be at the limits of the calibration range?
NIST, ASTM, OSHA, ISA; pretty sure all of them.

Two different terms are important here. One is instrument range: in the case of cells, that is 0mV up to wherever they become current limited. The other is calibration range, which is whatever range the instrument is calibrated over.
Outside of the calibration range, the values are not considered accurate, because they can't be verified there.
 

Yeah, I get this. But what standard says that the calibration points must be at the limits of that range? E.g. a 0-1" mic calibrated with 0.25", 0.5", and 0.75" blocks: the range is 0-1", not 0.25"-0.75".
 

The instrument range for that would be 0-1"; the calibration range would be .25"-.75", and anything outside of that range would not be considered accurate because you can't verify it.
The Meg may actually be considered 3-point if it draws one line from 0-.21 based on 0mV = 0 ppO2 and then a second line from .21-1.0, in which case it is accurate from ppO2 0 to 1.0; anything above that, while within the instrument range, is not within the calibration range.
That is where linearity starts to become very critical, because there is no way to calibrate up there. Easy math says that 60mV = 1.0 ppO2, so when you do your 1.6 ppO2 check you should see 96mV. Unfortunately the cell only shows 86mV, so it is about 90% linear. Even though you O2 flushed at 20ft and know the loop is pure O2, the handset only shows a ppO2 of 1.43 while the loop is actually at 1.6. If the calibration could be done at 1.6, the unit would know that 86mV = 1.6 ppO2 and draw the line accordingly.
If it drew only that one line, from 0mV = 0 through 86mV = 1.6, then on a dil flush where the ppO2 was supposed to be .4 it would read about .45, and in air it would read about .23, because of the linearity compensation. In this case the instrument range is still ppO2 = 0 to ppO2 = 1.6, but the calibration range is 1.0 to 1.6, so you will have inaccuracies outside of that calibration range.
Ideal situation is a unit that calibrates at .21, 1.0, and 1.7 and draws the three lines between those points (with 0mV = 0 as the fourth) to give an actually accurate ppO2 reading.
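Here's a sketch of what that multi-point scheme could look like. The mV values are invented to match the 90%-linear cell in the example above; nothing here is how any real controller actually does it.

```python
# Piecewise-linear calibration: interpolate between adjacent cal points
# instead of extrapolating one line. mV values invented so the top segment
# passes through the cell's real 86 mV = 1.6 ppO2 point from the example.

cal_points = [(0.0, 0.0), (12.6, 0.21), (60.0, 1.0), (90.3, 1.7)]  # (mV, ppO2)

def ppo2_from_mv(mv):
    for (mv_lo, p_lo), (mv_hi, p_hi) in zip(cal_points, cal_points[1:]):
        if mv <= mv_hi:
            return p_lo + (mv - mv_lo) * (p_hi - p_lo) / (mv_hi - mv_lo)
    # Above the top cal point we're back to extrapolating the last segment
    (mv_lo, p_lo), (mv_hi, p_hi) = cal_points[-2], cal_points[-1]
    return p_lo + (mv - mv_lo) * (p_hi - p_lo) / (mv_hi - mv_lo)

print(round(ppo2_from_mv(86.0), 2))  # 1.6 -- vs the 1.43 the extrapolated line showed
```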

That make sense?
 
If I understand this right, then my current CCR procedure is actually outside the calibration range (on a Shearwater). I manually flush O2 at 6m to cross-check against current limitation, but this is not conducted by the controller and assumes the cells are linear.

Following this logic, the only CCR (to my best knowledge) operating within its calibration range would then be the Se7en, which does an 'oxygen linearity test' and calibrates the cells at a ppO2 of 1.6 at the beginning of the dive.

...or am I lost now?
 
@NAND I don't know how the Se7en calibrates, but if it is conducting a linearity test and noting that point at 1.6 at the beginning of the dive, then it is the only unit I know of that does that.

The linearity check is done by the diver, and you essentially "calibrate" the unit on your slate. That's why you have to know the mVs as well as what the unit claims your ppO2 is.

I.e. I'll validate 1.6 on the descent. If it only says 1.52, then I am 95% linear, and I know that when I am trying to maintain a 1.6 setpoint on, say, deco, I should hold the loop where the unit reads 1.52. If I want to run my setpoint at, say, 1.3, I need to know that it is actually going to be about a 1.37 setpoint and factor my CNS clock and/or deco accordingly. I wouldn't necessarily change the deco side, because the error adds conservatism there, but you need to be aware of the CNS factor.
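That slate arithmetic, as a quick sketch (assuming the shortfall scales proportionally, as in the example above):

```python
# Validate against a known ppO2 (O2 flush at 20 ft -> true ppO2 1.6),
# derive a linearity factor, and scale targets by it.

true_ppo2 = 1.6
displayed = 1.52                       # what the handset showed on the check
linearity = displayed / true_ppo2     # 0.95 -> "95% linear"

# To actually hold 1.6 on deco, add O2 until the unit reads:
print(round(1.6 * linearity, 2))      # 1.52

# A 1.3 setpoint on the unit is really holding the loop at about:
print(round(1.3 / linearity, 2))      # 1.37 -- plan CNS (and deco) off this
```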

Now, being obviously new at this, I will calibrate on the surface, write down the mVs shown at 1.00 when it calibrates, and then fill out three tables in my wetnotes after calibration and before the dive. Rows are 1.0 through 1.6 in .1 increments, and columns are expected mV and displayed ppO2.
Table 1 is 100% linearity, table 2 is 95%, and table 3 is 90%.
When I go down and check my ppO2 at 20ft, looking for 1.6, I'll note what ppO2 it spits out at me, check the mVs, and use whichever table is closest.
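Those tables are easy to generate ahead of time. A sketch, assuming a simple proportional-droop model (real cells sag progressively, so treat the tables as the rough brackets they're meant to be):

```python
# Generate the three wetnotes tables: for an assumed linearity factor,
# list expected mV and displayed ppO2 for setpoints 1.0-1.6.

mv_cal = 60.0                          # mV at ppO2 1.00, written down at cal

for linearity in (1.00, 0.95, 0.90):
    print(f"\nTable at {linearity:.0%} linearity")
    print("ppO2  expected mV  displayed ppO2")
    for tenths in range(10, 17):       # 1.0 .. 1.6 in 0.1 steps
        ppo2 = tenths / 10
        mv = ppo2 * mv_cal * linearity # what the cell would actually put out
        print(f"{ppo2:.1f}  {mv:8.1f}  {mv / mv_cal:12.2f}")

# At 20 ft, flush O2, note the displayed ppO2 and mV, use the closest table.
```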
 