Are you meeting your perceived security obligations?
Three trending topics this week (the S.C. hacking incident, the Sophos vulnerabilities, and the SEC's lack of encryption) have me thinking about what security is obligatory, what counts as a reasonable security response, and whether that tide is rising across multiple verticals. In each case, the articles highlight existing disconnects in expectations across multiple audiences. When you combine the orientation of these articles with the evolving legal dialog about a reasonable standard of care, this evolution is probably relevant to every company with a cyber presence.
I’m going to provide a number of quotes from the various articles to identify what feels like a set of trends in the tone of reporting. At the bottom, I’ll discuss what I inferred from the tone of these quotes, and look at what that might mean for organizations.
1) Tavis Ormandy (the security researcher who found the Sophos vulnerabilities) asserts that Sophos should not have been shipping with the vulnerabilities, and that once they were alerted:
2) Sophos should have been able to ship a fix sooner, and further posits that Sophos was “clearly ill-equipped to handle the output of one co-operative, non-adversarial security researcher. A sophisticated state-sponsored or highly motivated attacker could devastate the entire Sophos user base with ease.”
3) This leads him to conclude that “you should be willing and able to disable Sophos installations across your fleet; and exclude Sophos products from consideration for high value networks and assets”.
4) When discussing the story, InformationWeek focuses on Sophos as a security company, asking whether it “side-stepped secure coding practices and failed to embrace modern attack-mitigation technologies”.
Moving on to the SEC lack-of-encryption data points:
- “SEC has been encouraging other organizations to pay attention to security”
- “SEC spent $200,000 to check whether it had lost critical information”
- The division that had the unencrypted machines was the same division that was supposed “to oversee compliance with rules and writing regulations for exchanges and brokerages” (for the major equity markets).
- “These policies essentially map out each exchange’s infrastructure in a level of detail that would be a boon to anyone looking to hack the most lucrative markets in the world”
Lastly, some early retrospective on the South Carolina Dept. of Revenue hack:
- “The state and a cybersecurity company acted negligently in allowing a state database to be hacked”
- “The Revenue Department chose not to use the free security monitoring offered by the Division of State Information Technology, a unit under the S.C. Budget and Control Board.”
- “The public is forced with the threat of jail to pay taxes and give their personal information to SCDOR, and yet SCDOR took only the flimsiest steps to protect this private data”
- “But Hawkins will attempt to prove that state officials and Trustwave violated a provision in state law that requires state agencies to disclose a breach of personal identifying information to taxpayers following discovery or notification of the breach.”
- “Avivah Litan, an analyst with Gartner, called the governor’s defense of the state’s security practices shaky.”
- “There are many other methods that are viable and, when used together, offer more protection than just encryption alone. The governor’s comments reflect unawareness of data security practices and are not at all reassuring,” Litan added.
- “Under most state data breach laws — including South Carolina’s — encryption provides businesses with safe harbor from notification in the event of a data breach”
Now, with all that quoted above, I think I can draw some succinct themes that span all three trending topics.
- The news media (at least) is expecting people who have any type of sensitive information to be protecting it, and keeping those protections up to date.
- That protection should reflect “modern” understanding of technologies and defense in depth.
- Executives (or governors) are expected to be able to speak, at least generally, to these topics, or have someone else (who can) do the talking.
- Organizations that hold large amounts of sensitive data in trust, such as governments and security companies, are expected to be more responsive and protective of that sensitive data.
- When something considered a security issue is uncovered, the responsible organization should respond quickly to explain the problem and fix it in the field expediently. In the Sophos case, Tavis posits that a response taking longer than a month is a problem.
Where does this fall down today?
- Although more CISOs and CSOs report to the board than ever before, that categorically does not mean the board, the chief executive, or the executive staff is aware of the technical details of security. In fact, I consider that an open question: should an executive or governor be expected to know these details?
- The definition of sensitive data, as soon as you talk to a business or consumer, is probably much larger than that defined by law, and there is no effort I am aware of to close that gap in language. I suspect it would go poorly for your organization if you were in front of a news camera trying to defend why you protected only the items required by law, and not those perceived as sensitive by the consumer, regardless of the legal obligations defined by your state, country, or regional law.
- Security professionals today identify a lack of qualified talent and a lack of organizational funding as key problems in their daily jobs, which probably implies they are doing what they can with what they have, and that likely falls short of the expectations identified in these trending articles.
From this, we can draw some forward-looking conclusions.
- If both news coverage and legal precedent are raising the expectations about what to do with sensitive data, your organization can choose whether to be an early or late adopter of those changing expectations.
- As with all decisions, your organization needs to weigh the risks.
- Being an early adopter probably means an increased investment in security, but a lower cost if a breach occurs. Given the security industry’s consensus that a breach is a matter of when, not if, this might be a good bet.
- Being a slow adopter means you defer the initial costs of increased security; but if this risk triggers, it will likely be more expensive for slow adopters, and in the worst case could shut your business down through expense or reputation damage, especially if others in your space chose to be early adopters.
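The early-versus-late adopter tradeoff above is essentially an expected-cost calculation. A minimal sketch of that reasoning, using entirely hypothetical figures (none of these numbers come from the articles quoted above):

```python
# Illustrative expected-cost comparison for early vs. late adoption of
# stronger security controls. All dollar amounts and probabilities are
# hypothetical assumptions for the sake of the example.

def expected_cost(upfront_spend, breach_probability, breach_cost):
    """Expected total cost = spend now + probability-weighted breach cost."""
    return upfront_spend + breach_probability * breach_cost

# Early adopter: invests more now, lowering both breach likelihood and impact.
early = expected_cost(upfront_spend=500_000,
                      breach_probability=0.2,
                      breach_cost=2_000_000)

# Late adopter: defers spending; under rising expectations a breach is
# likelier and costlier (notification, legal exposure, reputation damage).
late = expected_cost(upfront_spend=100_000,
                     breach_probability=0.6,
                     breach_cost=5_000_000)

print(f"early adopter expected cost: ${early:,.0f}")  # $900,000
print(f"late adopter expected cost:  ${late:,.0f}")   # $3,100,000
```

The point is not the specific numbers, which any real organization would replace with its own risk assessment, but that deferring the upfront spend only pays off if the breach probability and cost stay low while expectations rise.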