Protecting an SLA battery from over-discharge

Generally I've either got smart equipment or I've used Elk or DSC (I think that's who it was, maybe Bosch) in the past.
 
Problem is you need to know what current-limiting resistor is installed in the unit, as not all units play nice with all supplies. The Elk/Altronix white paper exists because of an install I did.
 
I guess you're referring to the Elk-965:
 
 
This function appears to be built into the Elk M1 Gold control.
 
From the installation manual:
 
AC Failure, Low Battery, and Automatic Low Voltage Shutdown
During an AC power failure the battery automatically takes over and AC Fail trouble annunciates at the keypad. The communicator can be programmed to report AC Fail to the Central Station after a time delay (see Menu 12, System Option 01). If the battery voltage falls below 11.2 VDC a Low Battery Trouble condition will occur. The communicator can be programmed to report Low Battery to the Central Station. The battery will continue to run the control until its voltage drops below 10.2 VDC, at which time the control will disconnect and shut down to prevent a false alarm and damage to the battery. The AC Fail trouble display will clear if the AC restores. However, the Low Battery Trouble requires a manual or automatic battery load test before it will clear. An automatic battery load test is performed every 24 hours. See Section 2.2 for powering up the control.
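The supervision logic the manual describes boils down to two threshold checks. Here's a rough sketch in Python using the manual's 11.2 V and 10.2 V numbers; the function name and structure are mine, not anything from Elk's firmware:

```python
# Thresholds from the Elk M1 Gold installation manual quoted above.
LOW_BATTERY_TROUBLE_V = 11.2   # annunciate Low Battery Trouble at keypad
SHUTDOWN_V = 10.2              # disconnect to protect battery / avoid false alarm

def battery_state(voltage):
    """Classify a measured battery voltage per the manual's thresholds."""
    if voltage < SHUTDOWN_V:
        return "shutdown"
    if voltage < LOW_BATTERY_TROUBLE_V:
        return "low_battery_trouble"
    return "ok"
```

Note the hysteresis-like detail in the manual: the trouble condition doesn't clear on voltage recovery alone, only after a battery load test passes.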
 
I'm guessing that what matters is the voltage in each cell of the battery.  I'm also guessing that each cell will have a similar but somewhat different voltage.  What's tricky is there's no way (that I know of anyway) to know the voltage of the cell with the least voltage, unless each cell had its own terminal exposed.  I've seen batteries that are literally built up from individual 2v cells wired together.  Maybe it's for testing purposes? I always assumed it would cost a lot more.  Maybe not, though, if all you do is replace individual cells when they go bad rather than the entire battery?  Or is that like replacing one tire instead of all four, as they're all likely to be similarly worn out?
 
All of which is to say: I wonder how the M1 Gold picked 10.2 VDC as the cut-off?  RAL thought 10.5 VDC was a good cut-off, broadly speaking, for a 12 V SLA battery, and RAL impresses me as someone with a deep understanding of this stuff.  I was thinking of a slightly higher cut-off, to account for variations in individual cell voltages, but it would be redundant if RAL's 10.5 VDC already did that.  I don't have a good feel for just how much practical difference, if any, it would make anyway, except that a cut-off higher than it needs to be cuts into usable mAh, while one lower than it needs to be doesn't gain you meaningful mAh.  Therefore, a drain test plot of voltage-current might tell the tale.  That's a lot of work though.
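If you did run that drain test, quantifying the usable-capacity difference between two candidate cutoffs is just integration of the logged current. A hypothetical helper (the function name and sample format are mine) that sums delivered mAh until the pack first dips below a candidate cutoff:

```python
def usable_mah(samples, cutoff_v):
    """Trapezoidal integration of current over time until the voltage
    first drops below cutoff_v.

    samples: time-ordered list of (seconds, volts, amps) tuples
             from a drain-test log.
    """
    mah = 0.0
    for (t0, v0, _i0_unused), (t1, v1, i1) in zip(samples, samples[1:]):
        if v0 < cutoff_v:
            break  # pack already below the candidate cutoff
        # average current over the interval, converted to mAh
        mah += 0.5 * (_i0_unused + i1) * (t1 - t0) / 3600.0 * 1000.0
    return mah
```

Run it twice on the same log, once with 10.5 V and once with 10.2 V, and the difference tells you how many mAh the lower cutoff actually buys you.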
 
Come to think of it, RAL had said to check the battery manufacturer's datasheet for the specific battery, as there may be differences in recommended cut-off voltages.  That does sound like the proper way to select the cut-off.  Thanks, RAL.  Maybe the 10.2 V is dialed into the M1 Gold because it's of relevance to whichever battery Elk happens to recommend?
 
 
I don't believe that there is a magic number. Temperature, age, load, and the new battery's specs are all factors, and the exactly correct cutoff voltage (if there is any such thing) is a moving target. When an SLA battery gets old it can just suddenly lose a cell or two and drop far below 10.5 volts.
 
You want a cutoff that gets the most hours and minutes from the battery without risking going to a voltage so low that the system stops performing properly. You could have a 12.0 V cutoff if you want to err on the side of safety; it's a judgment call. If you had a tank of fuel in your helicopter that is calculated to fly 100 miles, would you dare to fly out 50 miles and feel safe that you'll make it back to the airport, or would you build in a little safety factor? And how much safety factor is exactly right?
 
Mike.
 
I've been researching the cutoff voltage value a bit more. 
 
The cutoff voltage for an SLA battery varies based on a number of factors, as Mike stated.  Several battery manufacturers say 1.75 V per cell is the recommended cutoff (10.5 V for a 12 V battery).  But if you go a little below that, it won't kill the battery outright; it will just shorten its life, depending on how far you go below the recommended value.
 
The 1.75 V value is based on an assumed discharge rate.  If you discharge at a higher rate, you can discharge to a lower voltage.  This is counterintuitive, but it's the way the chemistry works.
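You could capture that rate-dependent cutoff as a small interpolation table. The points below are placeholders, NOT Panasonic's actual curve; the shape just follows the two figures discussed in this thread (1.75 V/cell around a low rate, 1.70 V/cell around 0.4C). Read the real points off your battery's datasheet graph:

```python
# Placeholder (rate_in_C, cutoff_volts_per_cell) pairs -- illustrative
# numbers only, NOT from any datasheet.  Replace with values read off
# the manufacturer's discharge-characteristics graph.
RATE_TO_CUTOFF = [(0.05, 1.75), (0.40, 1.70), (1.00, 1.60)]

def pack_cutoff(rate_c, cells=6):
    """Linearly interpolate a per-cell cutoff for a discharge rate,
    scaled up to the pack (6 cells for a 12 V SLA battery)."""
    pts = RATE_TO_CUTOFF
    if rate_c <= pts[0][0]:
        return pts[0][1] * cells
    if rate_c >= pts[-1][0]:
        return pts[-1][1] * cells
    for (r0, v0), (r1, v1) in zip(pts, pts[1:]):
        if r0 <= rate_c <= r1:
            frac = (rate_c - r0) / (r1 - r0)
            return (v0 + frac * (v1 - v0)) * cells
```

With these made-up points, a gentle 0.05C discharge gives the familiar 10.5 V pack cutoff and 0.4C gives 10.2 V.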
 
Panasonic has a graph that shows this in their technical handbook. 
 
 
You can see the image at the bottom of page 15 (actual page 16 in the PDF) in the Panasonic Handbook.
 
So I think the reason we see the 10.2 V setting on the Elk is that whoever picked the value was using a different discharge-rate assumption.  10.2 V would correspond to a 0.4C rate of discharge.  0.4C seems rather high to me if you want your battery to last a minimum of 4 hours: with a 7 Ah battery in the Elk, that would mean you are drawing 2.8 A, which you probably wouldn't do even in an alarm condition.  Something around 0.15C seems more realistic if you keep the Elk's total power draw around 1 A.
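The arithmetic behind those figures is simple division; a couple of throwaway helpers (names mine) make the C-rate relationship explicit:

```python
def c_rate(current_a, capacity_ah):
    """Discharge rate as a fraction of rated capacity per hour.
    E.g. 2.8 A from a 7 Ah battery is a 0.4C discharge."""
    return current_a / capacity_ah

def rough_runtime_h(capacity_ah, current_a):
    """First-order runtime estimate.  Ignores the Peukert effect,
    which shrinks effective capacity at higher discharge rates,
    so treat this as optimistic at high C-rates."""
    return capacity_ah / current_a
```

A 1 A draw on a 7 Ah battery works out to about 0.14C and roughly 7 hours of nominal runtime, consistent with the ~0.15C figure above.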
 
I really don't think about it that much. There's enough to think about before reinventing the wheel, and frankly, an LBC is only there to keep the batteries from deep cycling, not to inhibit discharging. If the end user is really concerned about standby time, voltage drop, and discharge, do the calcs. There's a reason why they're required on fire alarms, but in the same breath every burg panel install gets a 7 Ah installed and it's hope for the best.
 
Most equipment has a cutoff at 10.5 VDC or so, and if it doesn't, then after a deep discharge you're replacing batteries anyway, unless you plan on replacing equipment. 10.5 V vs. 10.2 V is a moot point: you're already at or below the point where the electronics aren't going to function properly.
 
We're talking small change and small batteries here, not a rack of 100 Ah or 200 Ah batteries on a FACP. At most, in a residential system, I've only seen 26 Ah installed, and that was close to overkill on standby time. If the standby time is that much of a concern (or derating), usually a secondary UPS source is found.
 
Batteries need regular replacement in the 3-5 year range. Period. They should also get a baseline test and periodic load testing, and any battery that falls 30% below its rating (i.e., delivers less than 70% of the listed capacity) gets replaced, period.
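That 70%-of-rating rule is a one-line pass/fail check (the function name is mine, not from any standard):

```python
def battery_passes(measured_ah, rated_ah, min_fraction=0.70):
    """Fail any battery whose load-tested capacity is less than
    70% of its rated capacity (i.e. more than 30% degraded)."""
    return measured_ah >= min_fraction * rated_ah
```

So a 7 Ah battery that load-tests at 5 Ah still passes, while one delivering 4.5 Ah is due for replacement.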
 
Most residential alarm systems and accessories have a 12 VDC nominal rating. UL requires that they operate between 80% and 110% of their rating. That is why some manufacturers shoot for a 10.2 VDC low-battery cutoff. I know of panels where the dropout is even lower, down to 9.5 VDC in a few cases. Below that it can be difficult to operate as intended and may damage the battery.
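As a quick sanity check on that 80-110% window for a 12 VDC nominal rating (this is just the arithmetic, not a UL citation):

```python
def within_operating_window(v, nominal=12.0, low_pct=0.80, high_pct=1.10):
    """True if v falls inside the 80%-110%-of-nominal operating window
    described above.  For 12 V nominal that's roughly 9.6 V to 13.2 V."""
    return low_pct * nominal <= v <= high_pct * nominal
```

Notice that a 10.2 V cutoff sits comfortably above the 9.6 V floor, while the 9.5 V dropout mentioned above lands just below it, consistent with equipment struggling to operate down there.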

But when 24-hour battery backup is required, the more power available from the battery the better.

One panel I have worked on maintains a regulated 12 VDC output for the entire 24-hour discharge period. That makes compatibility issues a piece of cake, but it comes at a cost, as do most good ideas.
 