
Last week, I shared some of Jeffrey Wheatman of Gartner's thoughts on the characteristics of effective metrics. In response, I received a few questions asking whether I knew of good metrics people are using “in the real world” to measure security effectiveness.

I have a list to share that should give you some ideas – or at least some examples. Some of these are metrics I’ve seen in use, while others are metrics I’ve helped establish:

Configuration quality:

  • % of configurations compliant with target security standards (risk-aligned)
    • e.g. >95% for Critical systems; >75% for Normal systems
  • number of unauthorized / undocumented changes
  • patch compliance by target area based on risk level (a computation sketch follows this list)
    • e.g. % of systems patched within 72 hours for Critical; …within 1 week for Normal
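
If it helps to see how a metric like the patch-compliance one above might actually be calculated, here is a minimal sketch in Python. Everything in it is hypothetical – the PatchRecord shape, the PATCH_SLA thresholds, and the two risk tiers are placeholders for whatever your asset inventory and patch tooling really track.

    from dataclasses import dataclass
    from datetime import timedelta

    # Hypothetical SLA windows per risk tier (mirroring the example targets above).
    PATCH_SLA = {
        "Critical": timedelta(hours=72),
        "Normal": timedelta(weeks=1),
    }

    @dataclass
    class PatchRecord:
        """One system's response to a single patch release (hypothetical shape)."""
        system_id: str
        risk_level: str           # "Critical" or "Normal"
        time_to_patch: timedelta  # elapsed time from patch release to installation

    def patch_compliance_by_risk(records: list[PatchRecord]) -> dict[str, float]:
        """Return the % of systems patched within the SLA window, per risk tier."""
        compliance = {}
        for tier, window in PATCH_SLA.items():
            tier_records = [r for r in records if r.risk_level == tier]
            if not tier_records:
                continue
            within_sla = sum(1 for r in tier_records if r.time_to_patch <= window)
            compliance[tier] = 100.0 * within_sla / len(tier_records)
        return compliance

    # Example: one Critical system patched in time, one late, one Normal system on time.
    records = [
        PatchRecord("web-01", "Critical", timedelta(hours=48)),
        PatchRecord("web-02", "Critical", timedelta(days=5)),
        PatchRecord("file-01", "Normal", timedelta(days=3)),
    ]
    print(patch_compliance_by_risk(records))  # {'Critical': 50.0, 'Normal': 100.0}

The same pattern – filter by risk tier, compare against a target threshold – works just as well for the configuration-compliance percentage at the top of this list.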

Control effectiveness:

  • % of incidents detected by an automated control
  • % of incidents resulting in loss
  • mean time to discover security incidents (see the sketch after this list)
  • % of changes that follow change process
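
Metrics like mean time to discover and the share of incidents caught by an automated control fall straight out of a basic incident log. The sketch below assumes a hypothetical Incident record with occurred_at, discovered_at, and detected_by_control fields; substitute whatever your incident tracking system actually captures.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Incident:
        """Hypothetical incident record; field names are placeholders."""
        occurred_at: datetime      # best estimate of when the incident began
        discovered_at: datetime    # when it was detected
        detected_by_control: bool  # True if an automated control raised the alert

    def mean_time_to_discover(incidents: list[Incident]) -> timedelta:
        """Average gap between occurrence and discovery across all incidents."""
        gaps = [i.discovered_at - i.occurred_at for i in incidents]
        return sum(gaps, timedelta()) / len(gaps)

    def pct_detected_by_automated_control(incidents: list[Incident]) -> float:
        """% of incidents first detected by an automated control."""
        detected = sum(1 for i in incidents if i.detected_by_control)
        return 100.0 * detected / len(incidents)

    # Example: one incident found in 4 hours by a control, one found a day later by a person.
    incidents = [
        Incident(datetime(2011, 5, 2, 9, 0), datetime(2011, 5, 2, 13, 0), True),
        Incident(datetime(2011, 5, 4, 8, 0), datetime(2011, 5, 5, 8, 0), False),
    ]
    print(mean_time_to_discover(incidents))               # 14:00:00
    print(pct_detected_by_automated_control(incidents))   # 50.0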

Security program progress:

  • % of staff (by business area) completing security training
  • average scores (by business area) on security recall tests

These are just a few examples, spanning several categories. By the way – the metrics in that last category that specify a breakout by business area? That’s designed to create a little inter-departmental competition. When you report metrics this way, nobody wants to be last on the list, so you’ll see some interesting gamesmanship emerge when you bring out the departmental breakdown.

Would any of these be useful to you?  Do you have any others to add to the list?  I’d love to hear from you.