
In a recent survey on the state of risk based security management, we asked leading information security and risk management professionals in the US and UK a simple question:

Is information security risk management an art or a science?

In answering this question, we asked them to consider “art” as being analysis and decision-making that is based on intuition, expertise and a holistic view of the organization, where “science” refers specifically to risk analysis and decision making based on objective, quantitative measures.

The results of the survey are shown here:


What is your informed opinion? Is information security risk management more of an art or is it purely a science?

Click here to take survey

Expert Opinions

As you might expect, the survey reveals that risk management is not considered purely art, and it’s not considered to be purely science either.

There are many models and key performance indicators available that allow us to apply scientific elements to infosec risk management, but those activities do not alone make it a science.

On the other hand, there will always be a need for risk management and security professionals to leverage their own experience and knowledge to make determinations that are based on their own background and particular circumstances – the “art” part of the equation.

So we decided to ask some risk management “heavy hitters” for their perspective on this topic. Here are some important take-aways from professionals who are constantly striving to improve the practice of risk management:

Infosec Risk Management is Neither an Art nor a Science

Risk management requires a balance of both art and science. Jack Jones, author and creator of the Factor Analysis of Information Risk (FAIR) framework and a CISO himself, points out that when we gather data and use models (the “science” part), we tend to make subjective interpretations based on our personal biases and perspectives (the “art” part).

A good way of representing how a balance is achieved is demonstrated in this analogy shared by Jones:

If you’re approaching an intersection in your car and the light turns yellow, do you choose to stop or to speed up to beat the red light?  It depends, of course.  In the instant that you see the light turn yellow, you take into consideration objective data such as your speed, how close you are to the approaching intersection, the road conditions, the condition of your vehicle, traffic, and whether there’s a police officer on the corner, etc. You then add your understanding of their relationships to one another (your mental model) to generate an understanding of the risk.  Then you make a very personal and subjective choice as to whether to stop or accelerate.  

In risk management, we get the best data we can, apply it against whatever model we’ve chosen to use for evaluating risk (e.g., FAIR), and then an executive or other decision-maker interprets the relevance of our analysis within the context of their organization’s priorities.
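To make the “science” half of that flow concrete, here is a minimal sketch of a FAIR-style quantitative estimate: simulate an annualized loss figure from assumed ranges for how often a loss event occurs and how much each one costs. This is not the FAIR methodology itself; the uniform distributions, the scenario and every number below are illustrative assumptions only.

    # Minimal sketch of a FAIR-style quantitative risk estimate (illustrative only).
    # The frequency and magnitude ranges below are assumptions, not real data.
    import random

    def simulate_annual_loss(freq_low, freq_high, loss_low, loss_high, years=10_000):
        """Monte Carlo estimate of annualized loss exposure.

        freq_low/freq_high: assumed loss events per year.
        loss_low/loss_high: assumed loss magnitude per event (USD).
        """
        totals = []
        for _ in range(years):
            events = round(random.uniform(freq_low, freq_high))  # loss event frequency
            totals.append(sum(random.uniform(loss_low, loss_high) for _ in range(events)))
        totals.sort()
        return {
            "mean": sum(totals) / years,
            "median": totals[years // 2],
            "p95": totals[int(years * 0.95)],  # a "worst reasonable year"
        }

    # Hypothetical scenario: 0 to 4 incidents per year, $10k to $250k per incident.
    print(simulate_annual_loss(0, 4, 10_000, 250_000))

The “art” is everything the sketch leaves out: choosing credible ranges in the first place, and deciding what that “worst reasonable year” figure should mean for the business.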

Art in this debate represents the purely visceral approach of the experienced professional, intuitively sculpting truth and beauty where none may have existed. Science in this debate represents the unemotional and objective approach of an analytical mind: getting something “down to a science” implies deep understanding and repeatability.

Nothing could be further from reality since those two extremes do not exist in the real world, noted Jay Jacobs, a senior data analyst on the Verizon RISK team and a co-author of the Verizon Data Breach Investigations Report.

Jacobs sees virtually no line between the art and the science. The main difference is that artists are infallible in the sense that art is an expression of the artist’s opinion, whereas scientists begin with the assumption that their opinion is fallible (especially when dealing with complexity) and so seek feedback and evidence as the basis for improving their understanding of an event.

The scientific approach typically wins out precisely because we are dealing with complex systems, and because we need to reduce our level of uncertainty.

All science has a degree of art to it, and all art has a degree of science as well, says Ben Tomhave, a board member of the Society of Information Risk Analysts.

Tomhave believes there will always be an inherently artistic attribute to risk management.

For example, science often flourishes by adding an artistic flair even if we’re simply building stunning visualizations from a big pile of tasty data, and we should always ensure that we have the best data and analysis available for making the best decisions possible (“commercially reasonable and legally defensible” that is).

“I think my co-blogger and co-presenter Alex Hutton said it best when he called risk a pseudo-science,” said David Mortman. “That is to say, we may be beginning something significant, but it’s not codified into a set of known systems that work.”

You can see a copy of the presentation they delivered at Black Hat and BSides Las Vegas.

Bob Rudis, director of enterprise security and IT risk management for a global financial services organization, views infosec risk management as more “alchemy” than pure art or pure science.

Rudis used the term alchemy in reference to Hutton’s and Mortman’s position that the current state of infosec risk management is that of a “proto-science” [editor’s note: modified from ‘pseudo-science’ to ‘proto-science’ for accuracy].

Rudis acknowledges that the majority of information risk practices he comes across resemble, at best, mythical incantations recited over a cauldron of esoteric formulas and components and, at worst, a divining-rod approach.

Infosec Risk Management Requires Mastery

In the book Outliers: The Story of Success, Malcolm Gladwell sets out to identify the shared traits of highly successful people. One common theme in the book is the “10,000-Hour Rule,” in which he asserts that in order to be great at what you do, you need to invest an enormous amount of time so that you can become “unconsciously competent.”

This notion is not foreign to Hutton, director of operations and technology risk for a financial institution, who encourages us to embrace the Japanese concept of “shokunin,” which has a meaning similar to “craftsmanship.”

The Japanese apprentice is taught that shokunin means not only having technical skills, but also implies a particular attitude, a social consciousness, and a “mastery” of one’s profession.

“So for me, I very much see an evidence-based perspective, quantitative modeling techniques, and the application of scientific method as the means to evolve and master my profession, as well as help me lead the optimal risk team,” Hutton said.

Show Me The People!

Donn Parker, who has written many books on computer security and whose thoughts on risk assessment are well known to those in the industry, believes that information security is really a trade and professional practice, and that ‘risky’ situations should be decided by management fiat, not by science.

He sees risk-based security as a way to introduce uncertainty and forecast future events with little or no valid known experience.

“It is unnecessary, wrong and dangerous to attempt to choose, justify and prioritize effective security solutions by forecasting the frequency and impact of adversity outcomes when dealing with unknown intelligent adversaries and unknown vulnerabilities in rapidly changing and complex IT,” Parker pointed out.

Parker’s contributions to this industry are many, and often he presents seemingly contrarian views. A few years ago I remember attending a very spirited discussion on risk management with David Mortman, Alex Hutton, Andy Ellis and Allison Miller where the panelists shared their views on what risk management practices are more effective for security professionals.

“Much of the security of an enterprise depends on well-known generally accepted controls and practices and a common body of knowledge,” said Parker.

Risk management draws from your personal experiences, and as Tripwire CTO Dwayne Melancon puts it, “It’s what you feel in your gut.”

Melancon sees risk management as a way to apply science to an emotional problem: how much risk can be tolerated in a given situation, and what can be done about the areas where the risk has become unacceptable?

However, it could be argued that many organizations do not possess a clear understanding of their own risk tolerance. Thus, although one can agree with Melancon’s statement that risk is based on what you “feel,” there needs to be more science in the process to allow individuals to assess situations with less personal bias and make decisions that are consistent with the organization’s overall appetite for risk.

Melancon also believes that risk management has some of the attributes of a science in that it enables us to explicitly declare the factors we will evaluate along with the measures for scoring those criteria.  We can then come up with more objective answers that are consistent with what we feel in our gut, as well as enabling the comparative ranking of different risks that are based on similar factors.
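As a rough illustration of what Melancon describes, explicitly declared factors and weights make the scoring repeatable and the ranking comparable across risks. The factors, weights and scores below are invented for this sketch; they are not Tripwire’s criteria or anyone’s actual model.

    # Sketch of explicit factor scoring for comparative risk ranking (invented values).
    FACTORS = {"likelihood": 0.4, "impact": 0.4, "detectability": 0.2}  # assumed weights, sum to 1

    def risk_score(ratings):
        """ratings: dict mapping each factor to an agreed 1-5 rating."""
        return sum(FACTORS[f] * ratings[f] for f in FACTORS)

    risks = {
        "unpatched internet-facing server": {"likelihood": 4, "impact": 5, "detectability": 2},
        "lost unencrypted laptop":          {"likelihood": 3, "impact": 4, "detectability": 4},
        "insider data exfiltration":        {"likelihood": 2, "impact": 5, "detectability": 5},
    }

    # Rank risks that share the same factors, highest score first.
    for name, ratings in sorted(risks.items(), key=lambda kv: risk_score(kv[1]), reverse=True):
        print(f"{risk_score(ratings):.1f}  {name}")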

“Risk management, or more generally management, is fundamentally an art,” said Dan Arista, a researcher focusing on cybersecurity risk management.

That said, wherever we talk about risk, or where “future outcomes” matter, we want our predictions to be guided by evidence.

The best kind is ‘scientific’ evidence, which requires the ability to experiment, to observe and measure phenomena, and to isolate the constants and variables that may need to be controlled.

However, we are plagued with a lack of known anchoring constants, difficulty in reproducing field conditions in controlled experiments, and the overwhelming scale and complexity of cyberspace.

The result is that risk analysts have had little success developing a cohesive set of theorems corresponding to the phenomena under examination. It’s difficult to make valid inferences about what will happen if a particular configuration is adopted.  This imprecision in predictability frustrates efforts at making it a science.

“We still don’t have a ‘theory’ of cybersecurity,” said Arista. Even if we had good predictive models to employ, answering the overarching question of how much risk is ‘reasonable’ remains an open issue.

“When the downside of risk can be externalized, the costs of security may not be economically rational. As long as we’re deploying IT, there will be some risk to take. All the focus has been on predictive models; the ‘normative’ models have been neglected,” said Arista.

The majority of what managers are burdened with is the exercise of discretion – particularly because there are still many trade-offs that elude quantification when strategizing on how to defend against an ever-changing, sentient threat.

We’re Closer to Art and Need More Science

Jones insists that risk management has been severely skewed toward art, and that we need to focus more on science to bring back the balance, while Tomhave doesn’t necessarily believe that we’re closer to art so much as we function with an elementary understanding of data, statistics, and analysis.

In order to mature, we need to develop and evolve our understanding of statistics, statistical modeling, data sampling, data gathering, and more. All should be required courses in higher education.

Tomhave also points out that it’s important to figure out how to condense and fast track the science part of the equation (decision sciences, ops research and overall statistical methods) to provide a broader, more immediate benefit to people and industries.

Rudis shared that we need to continue to pursue greater visibility into IT management practices, and a greater commitment to aligning threats and vulnerabilities to actual business losses with repeatable processes.

Progress is being made towards infosec risk management becoming a science, but there is more than a sliver of artisanship involved in the more mature risk disciplines that we could benefit from.

Jacobs admits that risk management is still more of an art, and that we need more science in order to grow and mature as an industry.  Science brings with it a level of rigor and feedback that is sorely missing in most infosec risk management programs.

The shift towards science doesn’t have to be difficult or complicated since it is a mindset above anything else.

More importantly, Jacobs believes we need to pursue risk analysis as a science so that the next generation of analysts won’t also be squinting one eye at complexity and pronouncing “medium risk”.

Although it’s unlikely that we’ll ever fully automate decisions that rely on data and analysis, Tomhave believes that we’re finally learning how to collect good quality data, learning how to apply statistical methods to analyzing it, and that we’re learning how to visually present the derived findings.

He believes we should seek to constantly improve science, but we should trust that we will also get better at making decisions through intuition and natural assimilation of the facts.

Risk is on its way to becoming a science, but it has a long way to go, according to Mortman. In order to get there, we need to find those systems that work, and this means building models, testing them, and then refining them. This will take literally thousands of iterations (and probably a lot more), but eventually we’ll get to a science.

Advice and Tips

Jacobs encourages us to try a simple exercise with our next risk analysis: start with the conclusion and think up an opposite situation (think of high risk as low), and then think of things you could observe, measure or collect that would disprove the new statement (a null hypothesis).

This exercise will result in one of two outcomes: 1) Science happens, or 2) We can’t prove or disprove our initial statement with evidence, and we should wonder how much confidence we should have in our initial conclusion. If we are paying attention, the latter outcome should lead to many more questions.
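For a concrete (and entirely made-up) picture of the exercise, suppose an analysis concluded “high risk” because a control supposedly fails often. The inverted statement might be “the control fails on at most 2% of attempts,” and audit data can then try to disprove it. A small sketch, assuming hypothetical sample numbers:

    # Sketch of the "invert and try to disprove" exercise with made-up numbers.
    # Inverted (null) statement: "the control fails on at most 2% of attempts."
    import random

    observed_failures, observed_trials = 9, 200  # hypothetical audit sample
    null_rate = 0.02

    # How often would a 2%-failure control produce 9 or more failures in 200 trials?
    simulations = 20_000
    extreme = sum(
        1
        for _ in range(simulations)
        if sum(random.random() < null_rate for _ in range(observed_trials)) >= observed_failures
    )
    p_value = extreme / simulations
    print(f"p ~= {p_value:.3f}")
    # A small p-value is evidence against the inverted statement ("science happens");
    # a large one means the data cannot settle it, and the original "high risk" call
    # deserves a harder look.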

In conclusion, different organizations practice infosec risk management in different ways depending on their level of maturity and expertise.

As Tim Erlin, Director of Product Management at Tripwire, mentions, the important question to ask is which approach is more effective – and for that we need good metrics to actually measure success.

That is a great idea for another post… How do you measure whether you’re being successful in your infosec risk management efforts? If you would like to contribute, comment below and ping me.

Hasta pronto!

@cindyv

 

P.S. Have you met John Powers, supernatural CISO?

Images courtesy of ShutterStock

  • The Architect

    Good article. The title reminded me of a tee shirt (from the previous millennium) from Primavera with the caption:

    "Project management; art, science or b*ll sh*t

    • LOL — why didn't I think of that title! And glad you enjoyed the article.

  • Phil Agcaoili

    Standards in our industry provide us a framework and a strategy to develop defensive measures against predictable events and scenarios. Standards applied along with risk management have allowed many in the private sector to invest in security at the level their businesses are comfortable with in order to remain competitive. Holistically, companies apply some form of risk management. COSO, for example, applies four objective categories of risk: Strategic, Financial, Operational, and Compliance. Risks introduced through information security, cyber security, data protection, service reliability/resilience, and privacy are embedded in each objective category.

    Some of the science of risk management lies in the evidence that we gather and deduce based on key performance indicators, key risk indicators, and performance goals and metrics. Some of the art of risk management lies in the experience, deductive reasoning, and opinions based on surveys, assessments, benchmarks, and maturity models. Both art and science help us formulate and apply risk management methodologies to prioritize and manage our finite, but precious resources.

    So why does risk management for the information security industry comprise 50% science and 50% art? When we add humans to the equation, the variables in the risk model equations become much more unpredictable. We start dealing with outlier scenarios that cannot fundamentally be addressed by most businesses. We’d go out of business if we did, and our executives would simply stop listening to us. Security leaders have a history of being alarmists.

    So science meets art. Successful security leaders use this to communicate the story to help inform their executives on what to do next. In the end, the business decides on how to react and how to be proactive—where to invest in security.

    This is why I believe that the US government's approach to cybersecurity is wrong. Applying the "All Hazards" approach to security may demand that businesses invest in securing against black swan events. No one who wants to stay competitive can invest this way. I guess the government can when they have infinite funds and are worried about every scenario. Our industry and the government have a lot to learn from the insurance industry.

    * The views expressed are my own and not that of my employer.

    • Nicely put Phil – you hit on some key minutiae that take the conversation to another level. I agree in that we have all seen “bad science” and its negative impact on the subject matter it attempts to explore. Good science is an art – in approach, methodology, interpretation and application. Strong, repeatable and consistent science is achieved by an experienced intellect with the instinct to make the next great leap to advance our collective knowledge.

  • Clemens

    thanks

  • Paige Hawin

    A fantastic post Cindy! Really informative but interesting at the same time. I think it is almost the human worker aspect that is part of the risk. There is always a sense of unpredictability and surprise and so an element of ‘art’ is almost unavoidable. Previous claims are studied so that those same risks can be avoided. It is based on experience and opinion which I think definitely is ‘art aspect’ that is not only needed but is inevitable. However, an element of science is equally crucial and helps us develop strategies and is all based on the technicalities and evidence. Without one another, the system of risk management would collapse.

    • Good points Paige – thanks for your insights!

  • Gray

    Brilliant article.