
If we put aside policy and politics around the PRISM news, this is actually a story of a successful application of a “Big Data” approach to security analytics.

In contrast to other Federal programs branded as “security theater,” this one appears to actually use security threat data and security analytics to fuel a systematic approach to finding potentially dangerous situations.  And, if the government’s claims are accurate, this approach has identified threats in time to do something about them before attacks occurred.

In other words, it seems to live up to the promise of Big Data security analytics.

The US Government has had this program in place since 2006, and I imagine they’ve learned a lot about how to be successful in this kind of analysis during that time.

Unfortunately, we will probably never gain access to many of those lessons learned, so we won’t be able to apply a lot of the best known methods in the enterprise.  Too bad: they have a multi-year head start on solving the security analytics problem, and that learning could help a lot of enterprises.

Enterprises Won’t Be As Successful with Big Data Security Analytics Anytime Soon

I can think of quite a few other reasons why enterprises won’t be as successful with big data security analytics as the PRISM program, at least in the near term.  For example:

  • Number and diversity of data sources.  Enterprises won’t have access to enough data sources to achieve the accuracy of PRISM.  Even if they did, few enterprises have the capacity to handle that volume of data effectively.  Current SIEM approaches don’t do well with long-term/historical data analytics; full packet capture is useless if you don’t know what you’re looking for; DLP is a fire hose of false positives; etc.
  • Expertise and human capital.  Most enterprise security teams are already running at capacity, are operating as “generalists,” and don’t have the ability to focus on one single task day in, day out.  And most enterprises don’t have the in-house skills to do advanced data analytics on large data sets, at least not yet.  I’ve seen some companies aggressively hiring quants to help with this, but that talent is scarce.
  • Clarity of mission.  PRISM has a pretty clear mission, with a constrained problem space (yes, it’s big, but it’s focused).  Enterprises have very broad remits, and the problem space is very large.  That makes it more difficult to create a scalable, deterministic analysis capability that satisfies general-purpose security.
  • Access to data beyond your organization.  This is a biggie.  Even with organized threat sharing (assuming it actually becomes consistently effective), enterprises are limited by the data they see and collect on their own.  This limited perspective will govern what we can ultimately discover with enterprise security analytics – a lot of the stuff you need to know to detect threats and attacks happens “out there” where an enterprise will never see it.
  • Some folks may not like what we’re doing or how we do it.  This is one area that is consistent between PRISM and the enterprise.  The more we gather, and the more we analyze, the more likely we are to step on someone’s toes or make someone uncomfortable with what we’re finding.

Those are just a few of the obstacles.  Obviously, this doesn’t mean we shouldn’t continue to pursue better security analytics using Big Data, but it won’t be a piece of cake.


P.S. Have you met John Powers, supernatural CISO?

  • K Snapper

    Dwayne — in my opinion, the efficiency and effectiveness of PRISM remain unclear. Electronic surveillance was a proven success before PRISM, and any success may derive simply from expanding the scope of surveillance and collection. The issue is the added value of PRISM analytics, as distinct from facilitating already proven techniques on a much larger scale. PRISM would need certain new analytic and technical capabilities to make the case for big data, as distinct from simply more data. I won’t speculate one way or the other, but touted successes absent key details are inherently ambiguous on that point.