
Alarms and alerts surround us every day. From the moment our clocks wake us up in the morning, we rely on alarms for many things. But what happens when those alarms and alerts malfunction? What does that do to us, and how does it affect our day-to-day life? Recall the Dallas emergency alert malfunction, when the city's outdoor warning sirens were falsely triggered in the middle of the night.

As it turns out, getting tired of these alarms can prove dangerous to cybersecurity.

A few years ago, Nick was traveling through Newark airport in New Jersey. All of a sudden, the airport alarm system started going off. He stopped and looked around as everyone just paused for a moment, stared at one another, then went along their way. In just a few moments, the alarm became an annoyance – not a sign of any real danger.

Several years earlier, however, Nick had been at LAX during a TSA-involved shooting, so this alarm initially panicked him. He ran up to the closest TSA agent and asked what was going on; it's not often you hear an airport-wide alert system go off. The agent's response was, "I don't know," and they didn't seem concerned to find out. Different rant for a different day.

The point is, we've all experienced false alarms just like these. Fire alarms go off by accident in our workplace or college dorm. Ocean safety authorities issue one false tsunami warning after another. When we first hear these alerts, we're likely filled with panic. But the more often these alerts falsely sound, the more our panic diminishes.

Much like the villagers in the "Boy Who Cried Wolf" story, we've become immune to what would otherwise be a sign of real danger. This is known as alarm fatigue. Each time we hear a false alarm, we're desensitized to the stimulus, whether consciously or not. Our attention span gets shorter, and our reaction time (e.g., how long it takes us to leave the building) gets longer.

In our increasingly connected world, digital alarms surround employees every day. Depending on the employee's role, they may even rely on alarm systems to perform their jobs. Take, for example, a security operations center (SOC) and the importance of alarms in that environment. False positives aren't just an annoyance – they can harm an organization when responses to real security events are delayed or non-existent.

But alarm fatigue also applies outside of SOCs and other settings in which security professionals work. We have employees who are not security experts (in fact, most of them probably aren't), so we use alerts and alarms to keep them informed about security risks. Right after a major, newsworthy cyber event, for instance, many IT departments might send an email saying, "Watch out for phishing emails! Stay alert, and help protect our clients' data."

Phishing awareness training is obviously important, and framing security in ways that employees can understand (e.g., protecting customer interests) is critical, as well. However, companies and governments are constantly experiencing data breaches and other cyberattacks, which means these warning emails are sent quite frequently. Considering what we now know about alarm fatigue, do we really believe this works to prevent phishing?

The same idea applies when security incidents occur at an organization. While we certainly should educate our employees and empower them to be more cyber-secure, we also shouldn’t overload them.

“Malware launched against company networks – don’t open an email from humanresources!”

“An employee’s credentials were recently stolen. Enable two-factor authentication on all of your accounts!”

And so on.

These alerts are trying to help, but if employees tune out as soon as they hear "cyber" or "encryption," are they really going to pay attention to an overload of these emails? More specifically, how quickly will alarm fatigue set in if employees receive constant reminders and alerts?

While the answer is by no means simple, this is just another reason to make cybersecurity “for the human.” We have to study how cognitive biases impair human decision-making and then design security training with that in mind. We have to fight the “scariness” of cybersecurity so employees will actually read and understand security alerts. And we have to build an internal alert system within our employees – one that becomes instinctual behavior – rather than just relying on beeps, dings, and pop-ups from software programs.

If we are to better prepare security professionals and non-professionals alike to face the complex landscape of threats, we need to recognize, study, and design around alarm fatigue.


About the Authors:

Nick Santora is the CEO of Curricula, a cybersecurity education company located in Atlanta, GA. Curricula is an innovative, story-based cybersecurity awareness training and phishing simulation platform. You can follow Curricula on Twitter @Curricula or check out their website.

Justin Sherman is a student at Duke University double-majoring in Computer Science and Political Science, focusing on all things cyber. He conducts technical security research through Duke's Computer Science Department; he conducts technology policy research through Duke's Sanford School of Public Policy; and he's a cybersecurity contributor for the Public Sector Digest. Justin is certified in cybersecurity policy, corporate cybersecurity management, social engineering, infrastructure protection, insider threat prevention, and homeland security planning from such organizations as FEMA, the National Institutes of Health, the U.S. Department of Homeland Security, and the U.S. Department of Defense.

Editor’s Note: The opinions expressed in this guest author article are solely those of the contributor, and do not necessarily reflect those of Tripwire, Inc.