Matthew Wald writes in the NY Times this morning that “when an oil worker told investigators on July 23 that an alarm to warn of explosive gas on the Transocean rig in the Gulf of Mexico had been intentionally disabled months before, it struck many people as reckless.
“Reckless, maybe, but not unusual. On Tuesday, the National Transportation Safety Board said that a crash last year on the Washington subway system that killed nine people had happened partly because train dispatchers had been ignoring 9,000 alarms per week. Air traffic controllers, nuclear plant operators, nurses in intensive-care units and others do the same.”
These are problems of human behavior and design in complex systems, such as a meat processing plant that collects plenty of listeria samples but fails to act when an increase becomes apparent.
If consumers and retailers have food safety recall fatigue, do producers and processors have alarm fatigue – learning to ignore rather than investigate data that may highlight a problem?
In the 2008 Maple Leaf listeria outbreak that killed 22 Canadians, an investigative review found that environmental samples had detected listeria in the culprit plant months before the public was alerted to possible contamination, and that the company failed to recognize and identify the underlying cause of a sporadic yet persistent pattern of environmental test results positive for Listeria spp.
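The pattern the review describes takes very little machinery to detect. As a minimal sketch (not Maple Leaf’s actual system; the sampling cadence, window size and threshold below are illustrative assumptions), a rolling window over routine swab results can flag exactly the sporadic-but-persistent signal that a per-sample pass/fail check never surfaces:

```python
from collections import deque

def persistent_positive_alert(results, window=12, threshold=3):
    """Flag a sporadic-but-persistent pattern of positive swabs.

    results: booleans in chronological order, True = Listeria-positive
             environmental sample (e.g., weekly swabs).
    Raises an alert whenever `threshold` or more positives fall inside
    a rolling window of `window` samples.
    """
    recent = deque(maxlen=window)
    alerts = []
    for i, positive in enumerate(results):
        recent.append(positive)
        if sum(recent) >= threshold:
            alerts.append(i)  # index of the sample that tripped the alert
    return alerts

# Isolated positives pass unnoticed one by one, but three inside a
# 12-sample window trips the alert, and it stays on while they persist.
weekly = [False]*8 + [True, False, False, True, False, True, False]
print(persistent_positive_alert(weekly))  # -> [13, 14]
```

The design point is that each positive looks ignorable on its own; only an accumulating view turns a scattering of results into an alarm worth investigating.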
Alarms and monitoring systems are established to alert humans – with all their failings – that something requires attention.
Mark R. Rosekind, a psychologist who is a member of the National Transportation Safety Board, told the Times,
“The volume of alarms desensitizes people. They learn to ignore them.”
Wald further writes,
“On the oil rig and in the Guam control tower, the operators were annoyed by false alarms, which sometimes went off in the middle of the night. At the refinery and the reactor, the operators simply did not believe that the alarms would tell them anything very important.”
Wald says, “… the alarms conveyed no more urgency to these operators than the drone of a nagging spouse — or maybe the shepherd boy in Aesop’s fable, who cried ‘Wolf!’”
So what to do? Warning systems need to be better designed, delivered and continually debated throughout any organization that values a safety culture. Engineers have known this for decades when designing fail-safe systems. The food sector has a lot to learn.
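What “better designed” can mean in practice: below is a hedged sketch of two alarm-hygiene rules, deduplication and escalation, of the kind engineers apply so the alarms that do reach a human still carry urgency. The class name, thresholds and return values are hypothetical, not drawn from any of the systems Wald describes:

```python
import time
from dataclasses import dataclass, field

@dataclass
class AlarmFilter:
    """Reduce alarm volume so the alarms that fire still mean something.

    Rule 1, deduplicate: the same alarm repeating within `cooldown_s`
    seconds is counted, not re-raised.
    Rule 2, escalate: an alarm suppressed `escalate_after` times is
    promoted, because chronic repetition is itself a signal.
    """
    cooldown_s: float = 600.0
    escalate_after: int = 5
    _last_seen: dict = field(default_factory=dict)
    _counts: dict = field(default_factory=dict)

    def raise_alarm(self, alarm_id: str, now: float | None = None) -> str:
        now = time.time() if now is None else now
        last = self._last_seen.get(alarm_id)
        self._last_seen[alarm_id] = now
        if last is not None and now - last < self.cooldown_s:
            self._counts[alarm_id] = self._counts.get(alarm_id, 0) + 1
            if self._counts[alarm_id] >= self.escalate_after:
                return "escalate"  # chronic alarm: demand investigation
            return "suppress"      # duplicate: log it, don't page anyone
        self._counts[alarm_id] = 0
        return "notify"            # fresh alarm: deliver it
```

The trade-off is made explicit rather than left to tired operators: suppressing duplicates risks hiding a real event, which is why chronic repetition escalates instead of disappearing — the opposite of dispatchers tuning out 9,000 alarms a week.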