
When it comes to risk management, I believe one of our biggest vulnerabilities is our limited ability to make good risk decisions based on the information available to us.

Our decisions are often inconsistent, subjective, myopic, and sometimes ignore the obvious. Why? Human nature.

A particular interest of mine is finding repeatable methods to become more effective at evaluating, articulating, and responding to information security risk.

As a result, I read an awful lot of stuff about risk and how it’s managed in other, non-IT disciplines, and look for models and constructs that can be re-used.

I’d like to share some thought-provoking observations with you today.

Don’t Confuse Me with the Facts…

Recently, I ran across a paper called “Communication and Interpretation of Risk” by J. Richard Eiser of the School of Psychology at the University of Exeter in the UK. This isn’t a new paper – it was published in 1998 – and it deals entirely with health and medical issues.

In spite of being a medical paper, it proved surprisingly relevant to the risk discussions I’m engaged in right now.

For example, one of the assertions of the paper is that, when it comes to risk, we often ignore facts and apply our own subjective biases to the decisions we make.

“Risk is traditionally defined in terms of probability. However, people often have difficulty in processing statistical information and may rely instead on simplified decision rules. Decision making under risk is also critically affected by people’s subjective assessments of benefits and costs.”

I see a lot of instances in which people see data on risks but overrule it with “gut.” That, in and of itself, is not necessarily a problem, but we need to acknowledge when this is happening so the decision doesn’t seem more “scientific” than it really is.
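Eiser’s point about our difficulty processing statistical information has a well-known security analogue: base-rate neglect. As a minimal sketch – with purely hypothetical numbers that do not come from the paper – here is a Python calculation showing that even a highly accurate detector produces mostly false alarms when real attacks are rare:

# Hypothetical numbers for illustration only -- not from Eiser's paper.
p_attack = 0.001              # base rate: 0.1% of monitored events are real attacks
p_alert_given_attack = 0.99   # detector fires on 99% of real attacks
p_alert_given_benign = 0.01   # detector's false-positive rate is 1%

# Bayes' theorem: P(attack | alert) = P(alert | attack) * P(attack) / P(alert)
p_alert = (p_alert_given_attack * p_attack
           + p_alert_given_benign * (1 - p_attack))
p_attack_given_alert = p_alert_given_attack * p_attack / p_alert

print(f"P(attack | alert) = {p_attack_given_alert:.1%}")  # roughly 9%

A “99% accurate” alert feels like near-certainty to the gut; the arithmetic says it’s right about one time in eleven. Acknowledging that gap is exactly the kind of honesty the quote above calls for.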

Is Your Emphasis Correct?

Another aspect the paper discusses is that we often give uncontrollable factors the same weight as controllable ones, when we should focus more on the things we can control:

“…a patient with a genetic predisposition for diabetes has no control over the genetic risk as such, but may influence the course and severity of any health consequences by adopting appropriate dietary and exercise habits and complying with medical advice. Statistical calculations of probability in such contexts are less important than their implications for behavior.”

How many times do we get embroiled in “edge case,” technically focused discussions, spending an inordinate amount of time debating things that simply happen to us, when we could make more productive use of our time and resources by focusing on ways to reduce the impact of what we can’t control?

“I’m Slightly Above Average…”

There is also the tendency to discount risks when we are the ones who have to do something about them:

“When people are asked to judge their own risk of some disease or mishap compared with that of some average or ‘typical’ other person, a well-replicated finding is that they rate their chances of avoiding the mishap as better than average.”

When one of your industry peers is compromised in a specific way, do you think you are somehow better than they are and unlikely to get hit in the same way?

If so, you may be experiencing this bias. Yes, it is possible you are better than they are at dealing with the threat, but I encourage you to question your assumptions to ensure that they are sound.

Focusing on the Rare (But Big) Rather than the Frequent (But Small)

Do you focus on solving the “big problem” vs. the “death of a thousand cuts” problem?

“When considering gains, choices that lead only rarely to gains (however large) are less likely (in the short term) to be rewarded and hence repeated than choices associated with certain benefits. Likewise, in the context of losses, riskier choices may, for a while, escape punishment and so will be less likely to be avoided than those that always result in a (smaller) loss. In other words, what appears to be a form of bias can be explained as the acquisition of expectancies based on actual experience.”

I see things that mirror this all the time, not only in day-to-day security operations but in the focus of regulations and mandates designed to “protect everyone” from security issues. Perhaps a bit more work on the things that happen all the time would serve us better in some cases.
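To see why the “death of a thousand cuts” can matter more than the headline-grabbing breach, a quick expected-loss comparison helps. The figures below are made up purely for illustration:

# Hypothetical figures for illustration only.
# Expected annual loss = expected events per year * loss per event.

rare_big_frequency = 0.02    # one major breach expected every ~50 years
rare_big_loss = 5_000_000    # cost of a major breach, in dollars

small_cut_frequency = 200    # routine incidents (phishing, lost laptops) per year
small_cut_loss = 1_500       # average cost per routine incident, in dollars

rare_big_expected = rare_big_frequency * rare_big_loss        # $100,000/year
small_cut_expected = small_cut_frequency * small_cut_loss     # $300,000/year

print(f"Rare-but-big expected annual loss:       ${rare_big_expected:,.0f}")
print(f"Frequent-but-small expected annual loss: ${small_cut_expected:,.0f}")

With these made-up numbers, the small, constant losses dominate, even though no single incident ever feels worth escalating. That is precisely the expectancy-from-experience trap the quote describes.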

Take a Step Back…

Those are a few of the things in this paper that made me say, “Hmmm…,” and I hope they engaged your critical thinking so you’ll be more likely to take a step back and be more conscious of human biases when evaluating risk.

The paper also has some useful things to say about risk communication. I’ll share more thoughts on that in the near future.

