Primary light battery


306dive306 (OP), Contributor, Canada, 100-199 dives:
What do you guys think a “fair/acceptable” life for a battery pack is?

To those of you who use Light Monkey, UWLD, BigBlue, or Hollis primary canister lights: how long did your battery pack last before burn time was reduced to half?

I thank you in advance for your helpfulness.

Cheers! 😎🤘
 
If you avoid storing them charged and avoid deep discharges, you're looking at 1000+ cycles. So 8 to 12 years at my diving pace.
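That 8-to-12-year figure is just cycle count divided by charging pace. A back-of-the-envelope sketch (the charges-per-year numbers are assumptions for illustration, not from the post):

```python
def years_of_service(rated_cycles: int, charges_per_year: int) -> float:
    """Estimate pack lifetime in years from a rated charge-cycle count."""
    return rated_cycles / charges_per_year

# ~1000 cycles at roughly 85-125 charges per year lands in the
# 8-12 year range quoted above.
print(round(years_of_service(1000, 125), 1))  # -> 8.0
print(round(years_of_service(1000, 85), 1))   # -> 11.8
```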

My suit heater packs actually take the most abuse.
 
If you notice, I said "(load equivalent to usage)". And yeah, the 3 in the 3.5v was missing :)

Cells at the factory are actually tested and rated to 3.4v (as I recall from when I last spoke with one of the engineers), but in any discharge testing I've done, the difference is so minimal that it's not worth stressing the cell chemistry.

I have the West Mountain unit as well.
 

I noticed; I was just reiterating for others when they are sizing a load rather than just using whatever the pack is designed to be attached to.

3.4v is fine as well, but like you said, we are splitting hairs at that point. While in a lab setting we may test to 3.4v, and with the CBA you can set it to stop there, it's not something you would want to make a habit of doing.

@Jona Silverstein, I believe, cell matches his packs, which as you said is not cheap. With the packs from @Jon Nellis for his DPVs, and the holders for the Silent Submersions sold by Fathom, you can DIY cell match, which adds to the cost and hassle, but it's at least possible.
 
Coming from RC hobbies, diving batteries/devices are all incredibly low draw, even DPVs.


Cell matching isn't going to add any performance or longevity.
A <$50 RC charger will cycle a battery and give Ah readouts. I'm not sure IR readings are valid when a BMS is hooked up, but IR is usually an indicator of pack age/health. The chargers are made more for batteries without a BMS, with external balance adapters.
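The IR reading those chargers report is essentially Ohm's law applied to voltage sag under load. A minimal sketch of the idea, with invented example numbers:

```python
def internal_resistance(v_rest: float, v_loaded: float,
                        i_rest: float, i_loaded: float) -> float:
    """Estimate internal resistance (ohms) from two voltage/current points:
    IR = delta-V / delta-I."""
    return (v_rest - v_loaded) / (i_loaded - i_rest)

# Example: a cell resting at 4.10 V that sags to 3.95 V under a 5 A load.
r = internal_resistance(4.10, 3.95, 0.0, 5.0)
print(f"{r * 1000:.0f} mOhm")  # -> 30 mOhm
```

A rising IR over a pack's life is the age/health signal mentioned above; absolute values vary by cell size and chemistry.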

If you want your overpriced scuba batts to last longer, put a storage charge on them when not being used for more than a few days. Everything should have a BMS to prevent low-voltage situations now, but if you run a pack low, disconnect/remove the batt.
If possible, disconnect/remove the battery when not in use. Anything with a magnetic button/etc. is still drawing the battery down over time; that circuit requires power to run (another bonus for twist lights, which disconnect the batt entirely when off).
 
Actually, the more I think about it, they told me a 3.0v rating. I haven't spoken to one of the engineers in too many years; I may need to follow up. But even that is so fractional, given the steep fall-off, that I've never quite understood the reasoning for using that cutoff.
 
4.2v to 3v is generally the safe charge range for li batts.
A cutoff of 3.5v is a bit high; you're leaving about 20% on the table. It is less stress on the battery, though. 80-20% charge is where they're happiest.
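The "20% left on the table" claim can be eyeballed with a rough open-circuit-voltage to state-of-charge table. The table below is a generic approximation I'm assuming for illustration, not a datasheet; real curves vary with chemistry, temperature, and load:

```python
# Approximate (voltage, state-of-charge %) points for one Li-ion cell.
VOLTAGE_SOC = [(3.0, 0), (3.3, 5), (3.5, 15), (3.7, 50),
               (3.9, 70), (4.1, 90), (4.2, 100)]

def soc_from_voltage(v: float) -> float:
    """Linearly interpolate an approximate state of charge (%) from cell voltage."""
    pts = VOLTAGE_SOC
    if v <= pts[0][0]:
        return 0.0
    if v >= pts[-1][0]:
        return 100.0
    for (v0, s0), (v1, s1) in zip(pts, pts[1:]):
        if v0 <= v <= v1:
            return s0 + (s1 - s0) * (v - v0) / (v1 - v0)

# A 3.5 V cutoff strands roughly 15-20% of usable capacity on this curve.
print(soc_from_voltage(3.5))
```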
 
Coming from RC hobbies, diving batteries/devices are all incredibly low draw, even DPVs.


Cell matching isn't going to add any performance or longevity.

I have found differences in charge and discharge times, and in the curves when graphed. Most of the change, though, is much later in the discharge curve, as well as in how they drop off. Now, how to automate that, much less quantify it, would be an interesting thing to write code for. My old code for matching RC cells back in the NiCad days would take into account the voltage drop when a load was applied, from initial discharge voltage to stable voltage, as a performance indicator, but lithiums don't do that. But rate of change in lithiums is a performance factor I've noted. Maybe in the copious amounts of free time I have :)
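One way the rate-of-change matching above could be sketched: compare cells by the slope of their discharge curves rather than a single sag number. The helper names and the sample data are invented for illustration:

```python
def discharge_slopes(samples):
    """Per-interval voltage slope of an evenly sampled discharge log."""
    return [b - a for a, b in zip(samples, samples[1:])]

def curve_distance(cell_a, cell_b):
    """Sum of absolute slope differences; smaller = better-matched cells."""
    return sum(abs(x - y) for x, y in
               zip(discharge_slopes(cell_a), discharge_slopes(cell_b)))

# Two hypothetical cells that diverge only late in the discharge,
# where most of the difference shows up, as noted above.
cell_1 = [4.2, 4.0, 3.9, 3.8, 3.7, 3.4, 3.0]
cell_2 = [4.2, 4.0, 3.9, 3.8, 3.6, 3.3, 3.0]
print(round(curve_distance(cell_1, cell_2), 3))
```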
 
