
Turn Off That Alarm!

Aug. 31, 2011
For Any Automated Equipment, Alarms Are Rarely a Good Thing. For a System Designer, an Alarm Is Simply the Easy Way Out
By Jack Chopper
Jack Chopper is chief electrical engineer at Filamatic (www.filamatic.com) in Baltimore.
I've often been accused of using human traits to describe machine controls, and every time someone mentions it, I chuckle to myself. The simple truth is that I find it easier to describe so-called "technical things" by using human behaviors as a reference. This levels the playing field, making discussions easier and more open. The tendency exists because of the way I view control systems: I see the controls from the perspective of how I personally want to interact with them, rather than how a particular machine or system needs to operate. I see alarms through that same lens.

For any automated equipment, alarms are rarely a good thing. The presence of an alarm is the control system's way of getting someone's attention, often because the system is unable to accomplish what it's trying to do. The alarm could also be a notification that the system needs something it can't provide for itself.

So an alarm alerts someone that the control system either wants a person to be aware of something or requires human intervention.

There are times, though, that an alarm is simply the easy way out for a system designer. Once the alarm is annunciated, the control system doesn't need to do anything else, since a human is expected to intervene. So the designer's job—for that facet of the system or function—is complete. In all fairness, sometimes there are simply no alternatives and an alarm is the only reasonable or safe thing to do.

The problem isn't with alarms themselves. It's with the human conditioning that accompanies alarm events. The more often alarms occur, the more likely they are to be ignored (or disabled, if possible). The effort to minimize the number of alarms is one of the places where logic diagrams or flowcharts really shine. If you study your documentation with the specific goal of eliminating unnecessary alarms, your control systems are almost certain to become more robust. The goal is to design systems so that alarms are rare events, and those events have consequences. Nothing about an alarm should be routine. If the event is routine, it shouldn't be annunciated as an alarm.

Sounds obvious, doesn't it?
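
In code, the difference might look something like the minimal Python sketch below (the function names and the retry threshold are hypothetical, not taken from any particular controller): routine failures get logged and retried, and only the exhaustion of the system's own options becomes an alarm.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("filler")

MAX_RETRIES = 3  # hypothetical threshold: routine hiccups get retried, not annunciated


def attempt_fill(try_fill):
    """Run the fill cycle, treating early failures as routine events.

    A failed attempt is logged and retried. Only when the system has
    run out of options does it annunciate an alarm and hand the
    problem to a human.
    """
    for attempt in range(1, MAX_RETRIES + 1):
        if try_fill():
            if attempt > 1:
                log.info("Fill succeeded on attempt %d; no alarm needed", attempt)
            return True
        log.info("Fill attempt %d failed; retrying", attempt)
    # Out of options: this is now genuinely an alarm condition.
    raise_alarm("FILL_FAILED", "Fill failed after %d attempts" % MAX_RETRIES)
    return False


def raise_alarm(code, message):
    # Placeholder for whatever the real system uses to annunciate.
    log.error("ALARM %s: %s", code, message)
```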

Most experienced writers review their work several times with the specific goal of removing words. This effort makes the piece better because using fewer words provides clarity. Less clutter is a good thing. This same strategy applies to control systems and, in particular, to interactions with the operator. Let the operator interact all he wants, but limit as best you can the number of times the system forces the operator into such interactions.

Alarms generally require the operator to interact. If you were to assume the operator is always busy adding value somewhere else, you might design your control systems differently, especially regarding alarm events. With this assumption, every single event requiring operator interaction or attention serves to reduce overall effectiveness.

I've seen interfaces on which every alarm required manual acknowledgement, including alarms whose triggering conditions had long since cleared. Must every alarm be manually acknowledged? I think not.
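
One way around that, sketched below in Python with made-up names rather than any particular HMI package's API, is to give each alarm a return-to-normal behavior: non-critical alarms clear themselves once the triggering condition goes away, and only the alarms that genuinely need a human stay latched until acknowledged.

```python
from dataclasses import dataclass


@dataclass
class Alarm:
    code: str
    message: str
    latching: bool = False  # only latching alarms require manual acknowledgement
    active: bool = False
    acked: bool = True

    def trigger(self):
        self.active = True
        self.acked = False

    def return_to_normal(self):
        # Called when the triggering condition clears. Non-latching
        # alarms disappear on their own; latching ones stay on the
        # operator's screen until someone acknowledges them.
        self.active = False

    def acknowledge(self):
        self.acked = True

    def needs_attention(self):
        return self.active or (self.latching and not self.acked)


# A low-level warning clears itself; a guard-door fault does not.
low_level = Alarm("LOW_LEVEL", "Hopper level low")
guard_open = Alarm("GUARD_OPEN", "Guard door opened during cycle", latching=True)

low_level.trigger()
low_level.return_to_normal()
assert not low_level.needs_attention()   # gone without an operator click

guard_open.trigger()
guard_open.return_to_normal()
assert guard_open.needs_attention()      # still waiting for acknowledgement
```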

I could try to list some general rules for alarms, but the truth is that this is an area where one size definitely does not fit all. I will offer, though, that the general thrust should be to use alarms as a last resort, rather than as an easy fix to a pesky controls problem. In some cases, tracing the chain of events that causes each alarm will reveal obvious opportunities to reduce how often it occurs. An informal failure mode and effects analysis (FMEA) will often add further insight. Conduct these exercises before the system is commissioned and you'll likely hear comments you wouldn't hear otherwise. Many of those comments are opportunities to build better systems.
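
If alarm history is already being recorded, even a trivial tally can point out where to start. The sketch below is a hypothetical Python example, assuming a historian export of (timestamp, alarm code) pairs:

```python
from collections import Counter


def alarm_frequency(alarm_log):
    """Tally alarm occurrences from an iterable of (timestamp, code) pairs.

    The most frequent alarms are usually the best candidates for a
    closer look: either the chain of events that triggers them can be
    interrupted earlier, or the event is routine and shouldn't be
    annunciated as an alarm at all.
    """
    return Counter(code for _, code in alarm_log).most_common()


# Hypothetical usage with a few log entries:
history = [("08:01", "LOW_LEVEL"), ("08:07", "LOW_LEVEL"), ("09:15", "GUARD_OPEN")]
print(alarm_frequency(history))  # [('LOW_LEVEL', 2), ('GUARD_OPEN', 1)]
```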

Better systems tend to work with the operator rather than arm wrestle him. As an added bonus, better systems tend to have more credibility. They're the ones operators trust, especially where alarms are concerned.

Isn't that what we wanted from the start?