What are the most reliable hardwired smoke alarms?

NeverDie

Senior Member
I read that NFPA and others have run field studies and found roughly a 3% failure rate in smoke alarms per year.  So, if you installed 100 smoke alarms, you'd expect about 3 to fail in the first year, about 15 to have failed by the fifth year, and about 30 by the tenth year.  Allegedly, this is why (1) it's recommended to replace a smoke alarm after 10 years, since there's roughly a 30% chance it has already failed, and (2) you're advised to test your smoke alarms either every week or every month, depending on who you read.
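For what it's worth, here's a back-of-the-envelope check of those cumulative numbers, assuming (my assumption, not something from the studies) that the 3% annual rate is independent from year to year:

```python
# Back-of-the-envelope check of the field-study numbers, assuming each
# alarm independently has a 3% chance of failing in any given year.
for years in (1, 5, 10):
    cumulative = 1 - 0.97 ** years
    print(f"after {years:2d} years: {cumulative:.1%} expected to have failed")
# after  1 years:  3.0%
# after  5 years: 14.1%
# after 10 years: 26.3%
```

So the quoted 15% and 30% look like simple linear tallies; independent annual failures compound a bit lower, but the ballpark is the same.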
 
Those failure rates seem awfully high.  Unfortunately, I don't see manufacturers posting solid reliability data for their smoke alarms.  Are there any reviews which do?  Which smoke alarms have the lowest failure rate?  Rather than buy a failure-prone smoke detector and then obsessively test it every week, why not buy a highly reliable smoke detector and test it less often?
 
In any case, it seems strange that for a "Life Safety" critical device, longevity data (reliability over time) is so difficult to come by.  Is it even tested for?  Otherwise, the situation bears an eerie resemblance to the regulation of compact fluorescent lamps, where out of the box the devices met the test criteria, but real-world longevity was low because the test criteria weren't adequate.
 
Supposedly, requirements for smoke alarm surge immunity go into effect for UL 217 on August 4, 2015.  Maybe part of the poor longevity stems from the historical lack of those requirements (coupled with the NFPA 72 requirement that smoke alarms be hardwired to mains in new construction), or maybe that's only a small part of it.
 
Regardless, some smoke alarms surely have better longevity than others, so if the average failure rate across a random sampling of smoke alarms is 3% per year, there may be some product lines that perform much worse than that and others that perform much better.  So, the question we should all be asking is: which brands and models have proven to have the best real-world longevity/reliability?  Anyone know, or know where to find the answer?
 
Thanks for the link to the Consumer Reports ratings.  Luckily, I have a Consumer Reports account, so I was able to read through it.  For those who don't, I'll briefly recap it here: the combo CO unit reviews do help differentiate a little between the different models (based mostly on differences in sensitivity to low CO levels), but the ratings for non-CO smoke alarms do very little to differentiate between them beyond the obvious difference that a combo ion/photo unit will react more quickly to more things than either a pure ion or a pure photo unit.  Although CR does test some things (like paint) for longevity and durability, it doesn't appear that CR tests smoke alarms for long-term reliability.
 
For instance, who knows whether even pushing the test button adequately tests alarm function?  If the accuracy of that test is poor, then even if you do obsessive weekly testing as some manufacturers recommend, your smokes may still be duds.  Without longitudinal testing, how would anyone know?  Probably the only warranty claims being made are ones where the test button indicates a failure, so there's an economic incentive for the test not to find anything more than the most blatant, obvious problems (like inadequate power to make the siren audible).
 
Any other suggestions?  Maybe I'm looking in the wrong places, but it's odd there isn't more transparency about this.  If code requires us to buy smoke alarms and install them a certain way, it's only fair that we be allowed to know the actual quality of what we're getting, not just fresh out of the box, but over time.  If that sort of reasonable data isn't readily available, then we should all be asking ourselves: why not?  One reason might be that companies making high gross margins by selling cheap, lousy smoke alarms don't want people to find that out.  Or maybe they don't even want to collect the data, out of fear it might force a recall, or that someone might use the information to sue them, so they have a vested interest in remaining ignorant.  On the other hand, if they're producing great alarms, they should have nothing to fear from full disclosure.  So, if the data isn't readily available, and if you agree the data is of obvious relevance for a life safety device, which do you think it is?
 
However, there's nothing to be gained by bringing that up here other than to illustrate there are prima facie reasons to have doubts.  I'd rather just know the facts, if they're available.  Are they?  Does anyone know?  Does anyone besides me even want to know?
 
Well, yes, but why go to that extreme voluntarily?
 
Let's make this more concrete.  For about the same total cost, you have the choice of installing a $129 smoke alarm (e.g. the new Elk-6050) and using it for up to 10 years (or less, if it becomes glaringly apparent that it has failed sooner), or buying a $12 smoke alarm and replacing it every year.  If all you know is that "on average" a smoke alarm has a 3% chance of failing per year, and if you aren't completely confident that the test button's test is thorough, the argument can be made that you're better off buying the $12 smoke alarm and replacing it every year, because that way you've capped the probability of an undetected failure at roughly 3% in any given year.  It's not a very solid argument, but if that's all the information you have for choosing, what better argument is there?
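To put rough numbers on that argument (purely a sketch: it assumes a flat 3% annual rate, assumes pessimistically that failures are silent and never caught by the test button, and ignores infant mortality):

```python
# Rough numbers behind the replace-every-year argument. Assumes a flat 3%
# chance of failure per unit per year, and (pessimistically) that failures
# are silent, i.e. never caught by the test button. Ignores infant mortality.
from statistics import mean

p = 0.03

# Strategy A: keep one alarm for 10 years. The chance that it is already
# dead grows each year.
dead_keep = [1 - (1 - p) ** t for t in range(1, 11)]

# Strategy B: replace the alarm every year, so each year starts fresh.
dead_swap = [p] * 10

print(f"keep 10 years : average {mean(dead_keep):.1%}, by year 10 {dead_keep[-1]:.1%}")
print(f"swap each year: average {mean(dead_swap):.1%}, by year 10 {dead_swap[-1]:.1%}")
# keep 10 years : average 15.1%, by year 10 26.3%
# swap each year: average  3.0%, by year 10  3.0%
```

Under those (admittedly pessimistic) assumptions, the 10-year unit spends roughly 15% of its service life already dead, on average.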
 
Electronic components fail at various rates, and by using testing and burn-in during manufacturing, the manufacturers are able to provide several grades of components with different reliability ratings.  The most reliable components tend to be rated for 100,000 hours MTBF (mean time between failure).   Since smoke detectors need to be highly reliable, I suspect that this is what you will find in most smoke detectors.
 
Once you get past early life failures, samples of electronic devices fail at essentially random points in time, at a roughly constant rate. If you do the math, a 100k hour MTBF actually works out to roughly an 8% chance of failure per year; the ~3% per year reported by the field studies corresponds to an MTBF of roughly 290,000 hours.
 
Each year, roughly 3% of them will fail.  But for the ones that do not fail the first year, that does not mean they are more likely to fail the next year or the year after that.  Even after 10 years, there is still just a 3% chance that a given one will fail in the following year.
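Here's a quick sketch of that math, assuming the constant failure rate (exponential model) I just described:

```python
# Constant-failure-rate (exponential) model, which is the standard model
# once early life failures are excluded.
import math

HOURS_PER_YEAR = 8766  # 24 * 365.25

def annual_failure_prob(mtbf_hours):
    """P(a unit fails within one year) given a constant failure rate."""
    return 1 - math.exp(-HOURS_PER_YEAR / mtbf_hours)

print(f"{annual_failure_prob(100_000):.1%}")  # ~8.4% per year
print(f"{annual_failure_prob(290_000):.1%}")  # ~3.0% per year

def fail_next_year_given_survival(mtbf_hours, years_survived):
    """P(fails during the next year | has survived this long)."""
    lam = HOURS_PER_YEAR / mtbf_hours  # expected failures per year
    survive_now = math.exp(-lam * years_survived)
    survive_next = math.exp(-lam * (years_survived + 1))
    return (survive_now - survive_next) / survive_now

print(f"{fail_next_year_given_survival(290_000, 0):.1%}")   # ~3.0%
print(f"{fail_next_year_given_survival(290_000, 10):.1%}")  # ~3.0%, unchanged
```

The last two lines are the memorylessness property: under this model, having survived 10 years tells you nothing about the coming year.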
 
Think of it as being similar to flipping a coin.  If you flip the coin 10 times and it comes up heads every time, there is still exactly a 50% chance that it will come up tails on the next flip. The fact that it came up heads the first 10 times means nothing for predicting the future.
 
So buying a new smoke detector every year, vs. keeping the same one for 10 years, will not necessarily give you better reliability. It might even be worse, due to the chance of early life failures.
 
Comparing the ELK smoke detector at $129 to the big box store model for $12 makes no sense.  The ELK detector contains 2-way wireless features that the $12 detector does not have, and that you won't find at that price.
 
In my opinion, you can't get away from the need for frequent testing of the smokes.  And you need to have enough smokes installed to give you adequate redundancy in the event that one does fail.  It's just the way it is.
 
From http://www.detectagas.com/uploads/File/GRI.pdf:  "The questionable performance of commercially available residential CO alarms may be attributable in part to shortcomings of the UL 2034 standard, particularly its continued omission of the requirements for quality assurance testing recommended by the Consumer Product Safety Commission (CPSC) in October 1996."
 
Granted, this is about the questionable performance of CO alarms, not smoke alarms. In that study, 3 out of 10 commercially available brands tested alarmed within the UL 2034 specification, and 7 out of 10 didn't. I'm not sure whether the same type of issue applies to smoke alarms under UL 217, but it illustrates the kind of issue I mean. Also, some brands/models performed better than others.  I'd like to know which were which.
 
Yes, I could use test smoke, but it seems like a crude test of whether the smoke alarm works at all, not of how well it is working or whether it is within spec.  It should alarm at a certain threshold, and I'm doubtful I can test that without a fairly advanced setup like the instrumented smoke boxes that UL uses.  Also, by then I will have already purchased the smoke alarm, so it's after the fact.  I may need 10 of them, so it adds up.  If some perform better than others, I'd prefer to start by buying one of the best, but I don't know of a way to gauge which is which other than scrutinizing reviews on Amazon and trying (possibly in vain) to sort it out that way.
 
Looked at another way: if the test buttons can be trusted, why does anyone buy cans of test smoke when they could just push a button instead?  I guess not everyone trusts that the test button is an accurate test.  All that said, I've never actually tried a can of smoke, so maybe I'm selling it short.  Has anyone here tried one?  I suppose it could give me a rough sense of how well the smoke alarm is working, at least for the particle size emitted by the test can.
 
The test buttons don't truly test the sensor element.  Rather, they test whether the detector will sound an alarm if the sensor ever detects something.  Think of it as a partial functional test, not a true I-see-smoke test.  That's why it's a good idea to use the canned smoke, as it exercises the sensor as well.  True, it won't tell you whether the unit is within spec, but it's the best you can do in the field.
 
I don't think there is any validity in trying to use the results of the CO detector reliability test as a guideline for smokes.   They are two completely different technologies.   Smoke detection is a pretty mature technology after all these years.  It wouldn't surprise me if just a few companies make the detection elements used in all the different brands of smoke detectors. If there were a significant difference between the brands, I think you would see someone like NIST, the NFPA, the CPSC, or Consumer Reports making an issue of it.
 
Makes sense, but if that's the case, I wonder why manufacturers like First Alert/BRK and Kidde/Firex only recommend pushing their test button?  I haven't seen either recommend doing a smoke test; at least, after looking at a few manuals, I haven't noticed them recommending it.  Plainly, you want to know if the sensor is working, as it might have gotten dust in it, or it might have failed for other reasons, and you wouldn't know that from just doing an audio test, as you point out.
 
The supervised smoke detectors seem to do a better job of monitoring their own sensitivity, and may even recommend when you should clean them based on the self-diagnostics.  I'm not sure whether consumer-grade smoke alarms self-diagnose, whether they are designed to false alarm as their way of getting maintenance attention when the homeowner has been slow to provide it, or whether they simply fail in unpredictable ways because the homeowner wasn't obsessive enough about testing them.  I'd be very curious to know, as the supervised self-diagnostics might be worth the hassle of converting to the type of alarm panel system that can supervise them.
 
For instance, the Elk-6050 is a supervised smoke alarm (i.e. it includes both a smoke detector and a sounder), and if I'm not mistaken, it also offers a wireless interconnect via its two-way radio that will meet the NFPA 72 requirements for new home construction.  Thus, you can inspect the supervisory information through the panel, or be alerted by the panel using whatever rules you've configured based on the supervisory status of each smoke.
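To make concrete the kind of panel rule I have in mind, here's a hypothetical sketch; the field names and thresholds are invented for illustration and don't correspond to Elk's (or anyone's) actual protocol:

```python
# Hypothetical sketch of a panel-side supervision rule. The field names and
# thresholds here are invented for illustration; they don't correspond to
# Elk's (or anyone's) actual protocol.
from dataclasses import dataclass

@dataclass
class SupervisoryStatus:
    zone: str
    sensitivity_pct: float  # self-reported sensitivity, as % of nominal
    needs_cleaning: bool    # a CleanMe-style self-diagnostic flag

def check(status):
    """Return an alert message if the detector needs attention, else None."""
    if status.needs_cleaning:
        return f"{status.zone}: detector requests cleaning"
    if status.sensitivity_pct < 80.0:  # arbitrary drift threshold
        return f"{status.zone}: sensitivity down to {status.sensitivity_pct:.0f}%"
    return None

print(check(SupervisoryStatus("hallway", 72.0, False)))
# hallway: sensitivity down to 72%
```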
 
The Nest Protect is an interesting middle ground, because it offers supervision by other means, without a panel per se, but it also relies on a wireless interconnect to meet the NFPA 72 requirements for new construction.  In a similar vein, Fibaro has a pretty cool interconnected and supervised smoke detector/alarm as well.  In a number of ways, it's more awesome than the Nest Protect.  They have a US version that's not on sale in the US yet, but I estimate (by straight currency conversion of its selling price in Europe) that it will cost about US$100.
 
Any thoughts on any of those three?
 
BRK's OneLink also offers a wireless interconnect and, through an Insteon smoke bridge, a way to query the smoke alarms and their status (battery high/low and whether the alarm is triggered or not), but I don't see that it has other interesting supervisory information, such as a CleanMe notification based on self-diagnostics of sensor sensitivity.  If I'm wrong about that, I'd be interested to know, because it also costs quite a bit less and may be an easier retrofit.
 
I'm interested in the self-diagnostic sensitivity info, because any insight into the health of the sensor, especially insight gained automatically and without obsessive testing, seems worthwhile and possibly labor saving as well.
 
I think the smoke detector manufacturers don't recommend using canned smoke because they have enough history with their products to believe that if the recommended maintenance is followed, the chances of a failure are extremely small.
 
Optical (photoelectric) smoke detectors used in homes generally work on light scattering, and tend toward a somewhat failsafe design.  An infrared LED fires into a dark chamber that the photodiode normally cannot see into; smoke particles entering the chamber scatter light onto the photodiode, which sounds the alarm.  The main enemy is dirt and dust, which scatter light the same way, so contamination tends to announce itself as a false alarm.  There's still a small chance that the LED or photodiode fails outright, in which case the unit would never alarm, which is why it's still necessary to test them.
 
With an ionization detector, I suspect that there is always some background level of ionization current.  That would be a pretty good indication that the detector element is working, and a good detector design would sound a false alarm if the background level ever dropped below normal.  Again, there's a chance of a failure mode where this might not be reported, so that's why you still do the tests.
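Purely to illustrate those two failure behaviors, here's a toy sketch; the signal names and thresholds are made up and aren't taken from any real detector's firmware:

```python
# Toy illustration of the two failure behaviors described above. The signal
# names and thresholds are made up, not taken from any real detector firmware.
SCATTER_ALARM = 0.5  # photodiode signal above this -> alarm
ION_ALARM = 0.7      # chamber current below this -> alarm (nominal = 1.0)

def photoelectric_state(scatter_signal):
    # Smoke and dust both scatter light onto the photodiode, so dirt tends
    # to announce itself as a false alarm. But a dead LED or photodiode
    # reads 0.0 forever, which looks "normal"; that silent failure mode is
    # what the test button and canned smoke exist to catch.
    return "ALARM" if scatter_signal > SCATTER_ALARM else "normal"

def ionization_state(chamber_current):
    # Smoke lowers the chamber current, but so does a degraded source or
    # sensor, so many failure modes here also end in an alarm.
    return "ALARM" if chamber_current < ION_ALARM else "normal"

print(photoelectric_state(0.0))  # normal, even if the LED is dead
print(ionization_state(0.6))     # ALARM: smoke *or* a failing chamber
```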
 
Maybe the manufacturers are comfortable enough with the designs that they don't need to recommend the canned smoke, but it gives me a much better feeling that the detector really works end-to-end. But maybe I'm more paranoid than necessary.  :)   I don't think canned smoke is necessary once a month, but once a year seems like a good middle ground.
 
RAL said:
Electronic components fail at various rates, and by using testing and burn-in during manufacturing, the manufacturers are able to provide several grades of components with different reliability ratings.  The most reliable components tend to be rated for 100,000 hours MTBF (mean time between failure).   Since smoke detectors need to be highly reliable, I suspect that this is what you will find in most smoke detectors.
 
Once you get past early life failures, samples of electronic devices fail at essentially random points in time, at a roughly constant rate. If you do the math, a 100k hour MTBF actually works out to roughly an 8% chance of failure per year; the ~3% per year reported by the field studies corresponds to an MTBF of roughly 290,000 hours.
 
Each year, roughly 3% of them will fail.  But for the ones that do not fail the first year, that does not mean they are more likely to fail the next year or the year after that.  Even after 10 years, there is still just a 3% chance that a given one will fail in the following year.
 
Some additional thoughts about my earlier comments.  Failure rates are rarely as simple as I described.  Some components have pretty constant failure rates throughout their lives, but many parts have a higher rate during the first few months, then fall to a lower level and stay there for a long time (years), and then the rate increases late in life (the classic "bathtub curve").  The MTBF number just gives you the average time between failures, without telling you what the year-to-year profile looks like.  So, you might have a component that has an 8% failure rate the first year, then drops to 1% for the next 9 years, then increases to 20% per year for the remaining years.
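Here's a quick sketch of that hypothetical bathtub profile (the numbers are the illustrative ones above, not measured data):

```python
# Sketch of that hypothetical bathtub profile: 8% in year 1, 1% in years
# 2 through 10, 20% per year after that (illustrative numbers only).
def yearly_failure_rate(year):
    if year == 1:
        return 0.08
    if year <= 10:
        return 0.01
    return 0.20

surviving = 1.0
for year in range(1, 16):
    surviving *= 1 - yearly_failure_rate(year)
    print(f"year {year:2d}: {surviving:.1%} still working")
# ~84% make it through year 10; survival then falls off quickly
```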
 
What I said earlier still holds, though. Even though the failure rate later in life might increase, it still doesn't tell you whether a particular unit is on the verge of failing.
 
But a component can also drift in value with age without failing outright, and that could cause a smoke detector not to work as well as it should.  Aging components could make the smoke detector less sensitive than it was when new: it might still pass a test, but it might not respond to a fire as quickly as it should.
 
I think it is probably a combination of these two issues which results in the recommendations to replace smoke detectors after 10 years.
 
I don't have a current copy of UL 217 to review, but I did find a 1993 version online, and it seems clear that UL is the one that specifies what reliability targets must be met in order to pass (http://archive.org/stream/gov.law.ul.217.1993/ul.217.1993_djvu.txt):
 
4 Detector Reliability Prediction
4.1 Detector units shall be designed to a maximum failure rate of 4.0 failures per million hours as calculated by a full part stress analysis prediction as described in Section 2.0 of MIL-HDBK 217B (20 September 1974) or 3.5 failures per million hours as calculated by a simplified parts count reliability prediction as described in Section 3.0 of MIL-HDBK 217B, or equivalent. A "Ground Fixed" (GF) environment is to be used for all calculations. If actual equivalent data is available from the manufacturer, it may be used in lieu of the projected data for the purpose of determining acceptable reliability.
 
At 8,766 hours per year, a maximum of 4.0 failures per million hours works out to roughly 3.5% per year, which is actually in the same ballpark as what the few field studies that have been done have reported.  In any event, for a device as simple as a smoke alarm, it really should be rock solid.  I don't see anywhere in the document a requirement to track defects in actual manufactured products.  However it happens, some smoke alarms have been recalled after passing UL 217, and the desire to get a UL 217 listing appears to be the only control over what gets manufactured.  I.e., passing UL 217 seems to mean that the target reliability is theoretically achievable, but apparently a lot can happen afterward (outside the purview of UL) to interfere with reaching UL's reliability targets.
 
Maybe European standards and processes are different, in which case maybe getting a smoke alarm from there that meets EU standards and also meets UL217 would be more reliable, but that's purely conjecture on my part and probably nothing more than wishful thinking.  
 
4 failures per million hours sounds very small, until you think of it this way...
 
100,000 hours is 11.4 years.  So if you had 10 smoke detectors installed, over an 11.4 year period you'd have accumulated a total of 1 million detector-hours.  And 4 of the 10 detectors, or 40% of them, could fail and the design would still meet the UL 217 requirement.
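A quick check of that arithmetic, and of the per-year rate it implies:

```python
# The same arithmetic, spelled out. 4.0 failures per million hours is the
# UL 217 design maximum quoted above.
HOURS_PER_YEAR = 8766  # 24 * 365.25

rate_per_hour = 4.0e-6
print(f"{rate_per_hour * HOURS_PER_YEAR:.1%} per detector per year")  # ~3.5%

detectors, years = 10, 11.4
expected = rate_per_hour * detectors * years * HOURS_PER_YEAR
print(f"{expected:.1f} expected failures among {detectors} detectors")  # ~4.0
```

So the UL 217 design maximum of 4.0 failures per million hours is roughly 3.5% per detector per year, consistent with the ~3% field figure discussed earlier in the thread.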
 