Have you ever relied on someone or something when in hindsight you shouldn’t have? Did you make decisions or do something you normally wouldn’t have done because you thought you were covered? Whether in a personal or business context, it’s happened to all of us at some point.
The first three parts of this series explored our tendency to cancel out improvements in risk mitigation with riskier behavior when it comes to incident notification. Is it intentional? Not really. But if we understand how and why this happens, we can take steps to avoid falling into this trap. In previous posts, we examined the following three common assumptions:
• Assumption #1. “We have a notification system; that’s all we need.”
• Assumption #2. “We can reach everyone using social media.”
• Assumption #3. “Our data is fine.”
Now let’s look at our final assumption in this series that contributes to the emergency notification fallacy.
Assumption #4. “We’ve tested our plan, and we’re ready.” There’s testing, and then there’s testing. Telling people there will be a drill tomorrow at 1:00 pm is more like training; holding an unannounced drill at 1:00 pm is testing. We need to test every component of the plan and every audience – crisis team members, first responders, constituents, external players, and others – including how they respond to communications (technology AND content). Former FBI agent Fred Miller offers a model for testing a plan thoroughly. Miller conducted a campus-wide active shooter drill in conjunction with the county sheriff’s SWAT team, sheriff’s department, hostage negotiators, K-9 unit, bomb squad, community hospital, fire department, and ambulance service. The test uncovered easy-to-overlook details and unknown flaws, which he then corrected – preparation that served him when a student attack occurred later that year.
So what does all this mean? It’s too easy to fall into the traps of overconfidence and complacency. We’ve all done it, and we all know where that road takes us. So while having an emergency notification system certainly helps us communicate better during a crisis, we must be careful not to negate that risk reduction by being nonchalant about the people, processes, and messaging needed to make the technology effective in an emergency.
I leave you with this thought: Icarus soared high on his new wings, but he fell far when he flew too close to the sun.