Several weeks ago Bruce Schneier asked his readership if they wanted to make a deal: buy a signed copy of Liars and Outliers at a substantially reduced price in return for writing a review.  I took him up on that offer, as did several others.  My review has been delayed because I wanted to make sure I had absorbed (I think I have) what Mr. Schneier has to say about a topic that is a departure from his usual subjects.  Additionally, I wanted to make some attempt to apply his construct to the concept of information risk management.

Liars and Outliers is such a departure from his usual technical subject matter that I was initially concerned Mr. Schneier may have overreached.  It’s not every day that a security guru talks about theories of coercion and explains how society establishes and maintains trust – that’s about society, not components in an IT system!   Still, as many others have pointed out, Mr. Schneier has done a bang-up job of distilling what we know as trust into an applicable framework to aid in further discourse.  In fact, of the reviews I read, only one was negative (and, in my opinion, a bit harsh).

The Liars and Outliers Construct

Trust in our society is a system, and before we can really start to understand it, we need to define a few terms.

  • Group: A society of any size greater than one, used to classify individuals (or, in some cases, organizations)
  • Cooperator: An individual or organization who goes along with the group norms in a given context
  • Defector: An individual or organization who goes against group norms in a given context
  • Dilemma: A situation in which no available choice is advantageous for either party
  • Societal Dilemma: A dilemma at a larger scale, affecting that which we would consider a large group
  • Societal Pressure: Forces affecting individual and/or organizational decisions when faced with a societal dilemma
Mr. Schneier outlines four societal pressures (using his own words to describe each):
  • Moral: “A lot of societal pressure comes from inside our own heads. Most of us don’t steal, and it’s not because there are armed guards and alarms protecting piles of stuff. We don’t steal because we believe it’s wrong, or we’ll feel guilty if we do, or we want to follow the rules.”
  • Reputational: “A wholly different, and much stronger, type of pressure comes from how others respond to our actions. Reputational pressure can be very powerful; both individuals and organizations feel a lot of pressure to follow the group norms because they don’t want a bad reputation.”
  • Institutional: “Institutions have rules and laws. These are norms that are codified, and whose enactment and enforcement is generally delegated. Institutional pressure induces people to behave according to the group norm by imposing sanctions on those who don’t, and occasionally by rewarding those who do.”
  • Security Systems: “Security systems are another form of societal pressure. This includes any security mechanism designed to induce cooperation, prevent defection, induce trust, and compel compliance. It includes things that work to prevent defectors, like door locks and tall fences; things that interdict defectors, like alarm systems and guards; things that only work after the fact, like forensic and audit systems; and mitigation systems that help the victim recover faster and care less that the defection occurred.”

These are, in effect, the dials that control the general outcome of a given societal dilemma, and they all work to varying degrees depending upon the scale of the problem (yes, scale changes everything).  Moral pressures work in smaller groups; reputational pressures work in mid-sized and even larger groups (given technological assistance – think of eBay’s reputation system); institutional pressures work in mid-to-large-sized groups; security systems essentially augment all of the others.

An example should help to put these pieces together – I’ll borrow a bit from the book and talk about criminals testifying against each other.

  • Societal dilemma: Criminals testifying against each other
  • Society: The criminal organization
  • Interests
    • Group: Minimize punishment for the society
    • Competing: Minimize personal punishment
  • Norm and Defection
    • Group norm (cooperation): Don’t testify against each other
    • Corresponding defection: Testify against each other in exchange for reduced punishment
  • Societal Trust Mechanisms
    • Moral: People feel good when they support other members of their group, and bad when they let them down
    • Reputational: Those who testify against their fellow criminals are shunned
    • Institutional: The criminal organization severely punishes stool pigeons
There are numerous other examples in the book, but this one is pretty easy to grok, though it doesn’t show any security systems in place (other examples do).  This is the basic construct explained throughout the book, and there are many nuances and imperfections – it’s not supposed to be perfect, but is supposed to frame a meaningful discussion.  In fact, I’d like to move to just such a discussion in the specific domain of security and compliance, or, if you prefer, risk management.
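To make the construct concrete, here is a minimal sketch in Python of the dilemma above expressed as a data structure. The class and field names are my own invention for illustration – none of them come from the book:

```python
from dataclasses import dataclass, field

@dataclass
class SocietalDilemma:
    """An illustrative model of the Liars and Outliers construct."""
    society: str
    group_interest: str       # what the group as a whole wants
    competing_interest: str   # what the individual wants
    group_norm: str           # the cooperative choice
    defection: str            # the corresponding defecting choice
    pressures: dict = field(default_factory=dict)  # pressure type -> mechanism

# The criminal-testimony example, expressed in this model:
snitching = SocietalDilemma(
    society="The criminal organization",
    group_interest="Minimize punishment for the society",
    competing_interest="Minimize personal punishment",
    group_norm="Don't testify against each other",
    defection="Testify in exchange for reduced punishment",
    pressures={
        "moral": "People feel bad when they let the group down",
        "reputational": "Those who testify are shunned",
        "institutional": "The organization severely punishes stool pigeons",
        # note: no security system pressure in this particular example
    },
)

print(f"{snitching.society}: cooperate = {snitching.group_norm!r}")
```

The point of writing it down this way is that every dilemma in the book slots into the same shape: a society, a pair of competing interests, a norm/defection pair, and some subset of the four pressures.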

The Four Truths

There are, as I have written before, Four Truths we have to deal with in this security and compliance world. Our threat agents (defectors, from our perspective) are increasingly motivated and agile. The complexity of our systems is ever increasing (societies of various measures continue to grow beyond the reach of certain societal pressures). Our situational awareness (or, should we start to say our “risk awareness”) changes quickly. Our resources are scarce – we always have trade-offs to make.

These Four Truths are, in effect, embodied, if not more fully explained, when Mr. Schneier talks about how societal pressures are affected by scale. He specifically mentions data migration to the internet (increasing complexity), technological mediation of social systems (increasing complexity), migration of evolved social systems into deliberately created socio-technical systems (increasing complexity), class breaks (increasing complexity), automation (situational awareness/complexity), action at a distance (a nod to threat agents), technique propagation (a nod to threat agents and situational awareness), technique iteration and improvement (again, threat agents and situational awareness), and defector aggregation (motivated and dynamic threat agents).  The fact that our resources are scarce only underscores the importance of recognizing the other points.

What I began to recognize, and what Mr. Schneier (along with several reviewers) points out, is that there is no silver bullet for “solving” the problem.  In fact, there really isn’t a problem to solve – there’s more of a problem to recognize.  A particular sentence stands out and should help shed light on what I mean: “Because we are all simultaneously members of many different subsets of society with competing interests – family, friends, colleagues, party, nation – we need security failures for progress.”  The problem is not to ensure everyone can trust everyone when they need to.  The problem is to ensure that, for a given societal dilemma, we’ve minimized defection to a tolerable level.

To put this another way: we’re playing the wrong game.  From my perspective, it seems that we (the information security industry) continue to view the battles we face on a daily basis as a zero-sum game, thus thinking we can win!  The reality is that we can never win – there is no winner with the zero-sum approach – so we need to modify the game so that dilemmas are mitigated and defection minimized.  It really is a different paradigm, and I hope to see further study in the area based on Mr. Schneier’s work.

Changing The Game

As mentioned, there are four societal pressures: Moral, Reputational, Institutional, and Security Systems. However, as society becomes increasingly globalized, the effectiveness of these pressures changes. Mr. Schneier recognizes this by associating certain pressures with small, medium, and large groups (scale changes everything).  In our sphere of influence as security and compliance practitioners, I would argue that Institutional and Reputational pressures are the best chances we have for leveraging societal pressures to reduce the overall scope of defection, augmented as necessary by effective security systems.  Moral pressures can only be brought to bear in smaller group settings, and this is where management styles, interpersonal relationships, and all those “soft skills” everyone’s always talking about will come into play (I think).

Organizational policy is the institutionalization of organizational desire and, therefore, a societal pressure in the group that is the organization. Human Resources, management philosophy/behavior, and peer interaction may all be appropriate sources of institutional, reputational, and perhaps moral pressures inside the organization. Each of these is, in turn, affected by the individuals comprising the subdivisions of the organization, which makes considering the societal dilemmas for even a single organization a mind-blowing exercise.

As mentioned throughout the book, defectors are always present. The tough balancing act we have (as a global society) is to find the right set of societal pressures that minimize the number of, and damage caused by, defectors. Given that some contexts require very few defectors to cause catastrophic damage (consider airplane-related terrorism), we can start to frame our efforts in terms of risk management.  Similarly, we can also see that detection and response are critical components of dealing with the fact that defectors are ever present.

How Does Risk Fit In?

When it comes to the risk trade-off, Mr. Schneier most often presents risk from the perspective of making a cooperate/defect decision. We could look at this conversely: when designing societal pressures (it really needs to be about design), we should consider the risk trade-offs from our perspective in addition to those of the individuals making cooperate/defect decisions.  Remember, the goal is to reduce defection in a given context to an acceptable/manageable level.

As he mentions, there are many ways for societal pressures to fail; institutional pressures are particularly prone to a variety of complex failures. Yet we still see a lot of laws and regulations being published that provide very little value for the necessary resource expenditures.

If you accept the possibility that every societal pressure we design and implement is an exercise in risk management, then we can start to frame discussions in risk-based terms.  What is the threat?  What vulnerabilities could lead to the threat being realized?  What mitigation strategies exist for those vulnerabilities?  What are the measurements we use to determine probabilities of these vulnerabilities being exploited?  If we accept that societal pressures can be framed in risk management terms, then we can start looking at how we construct control frameworks in more practical ways.  Should they be verbose? How can we be terse but expressive? What are the right ingredients?
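As a small illustration of what that risk-based framing might look like in practice, here is a hedged sketch that pairs each of the questions above (threat, vulnerability, mitigation, probability) with a crude measurement. The vulnerabilities, probabilities, and impact scores below are invented for illustration – they are not drawn from the book:

```python
# Framing a societal pressure as a risk-management exercise: for each
# vulnerability, pair an estimated probability of exploitation with an
# impact score, then rank candidate mitigations by expected impact.
vulnerabilities = [
    # (description, probability of exploitation, impact 1-5, mitigation)
    ("Weak reputational pressure at large scale", 0.6, 3,
     "technological mediation (e.g. rating systems)"),
    ("Institutional rules without enforcement", 0.4, 4,
     "delegated, accountable enforcement"),
    ("No detection of defection after the fact", 0.7, 4,
     "audit and forensic systems"),
]

# Rank by expected impact (probability x impact): a crude measure, but
# enough to drive a discussion about where scarce resources should go.
ranked = sorted(vulnerabilities, key=lambda v: v[1] * v[2], reverse=True)
for desc, prob, impact, mitigation in ranked:
    print(f"{prob * impact:.1f}  {desc} -> {mitigation}")
```

Even a toy ranking like this forces the two questions that matter under scarce resources: which defections can we tolerate, and which mitigations buy the most reduction per unit of expenditure.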

It turns out that the right ingredients are at least suggested in Liars and Outliers.

  • Everyone understands the group interest and knows the group norm
  • Group norm is something the group actually wants
  • Institutional delegate enforcing the group norm must be accountable to the group (effectively self-regulated)
  • Penalties for defecting must be commensurate with the seriousness of the defection
  • System for assessing penalties must be consistent, fair, efficient, and relatively cheap
  • Group must be enabled to develop its own institutional pressure – not imposed from without
  • Larger groups need to be scaled and nested in layers, each operating along the same lines

Additionally, if taking the risk approach leads to a more detection- and response-centered perspective, then what does this do to the importance of disaster recovery and business continuity planning?  It seems that DR/BCP is really all about response, which would seem to place it squarely in the domain of risk management. In fact, I’ve long thought that taking on DR/BCP as an initial effort would naturally steer an organization toward good risk management practices.

In Conclusion: A Call To Action

We should start rethinking the way we view the world in our industry, which is loosely defined as “security compliance” or “information risk management.” What would the world look like if we started from the perspective of DR/BCP and risk management, where everything we focus on is first viewed as a vulnerability, then assessed for some level of criticality that is meaningful to us?

On the broader level, what if lawmakers did the same thing? Would we still have Three Felonies a Day? Should we? Probably not.  Rather than go on about who could take what action, I guess I can put this another way by answering the question: Who should read this book?  At a minimum, organizational policy makers, “hands-off” security professionals, institutional operators, lawmakers, spiritual and cultural leaders.

In fact, it would be beneficial to send a copy to your lawmakers at a variety of levels (municipal on up to the federal level). Or, if you don’t want to buy a copy of the book, send a few reviews to them (there are many places to look – just go to Mr. Schneier’s book page or see some of the links above; or you can send this one, if you’d like).  If your management is responsible for designing your corporate policies, leave a copy on their desk. We’re definitely moving into a global society, and the more we do, especially at increasing speeds, the more critical getting our institutional pressures right becomes. The framework Mr. Schneier presents is the best start I’ve seen to framing an intelligent discussion regarding the trade-offs.  The construct is fairly easy to “see,” though complete understanding takes some effort.


Categories: Risk-Based Security for Executives



Adam Montville

Adam Montville has contributed 32 posts to The State of Security.
