
It appears we reached a global level of moral outrage surpassing a high warp factor during the week commencing 19th March 2018 with widespread news coverage of the machinations of Cambridge Analytica (CA). [I write as a long-time cynic who did not need to experience last week to know that “I am the product.”]

We are all making choices daily – for example, allowing the location features to be turned on, so that our devices can more easily orientate us in various applications. There is a wealth of data being gathered about our preferences on a daily basis – unless, of course, you have entirely abstained from not just social media but also online shopping, cloud-based email services, map applications etc.

Everyone appears to have a left-hand-versus-right-hand challenge: actively participating in trivial, nonsensical surveys, quizzes and games through online social media fora, yet expressing extensive indignation at the very notion that the resulting data might be mined, shared and utilized for nefarious purposes. Many of today’s applications and services appear to suffer from a level of neediness that feeds on the immaturity of their non-paying users.

Picture Audrey II, the big flower in the basement of the Little Shop of Horrors; these systems need constant feeding in order to hit usage targets and data volumes sufficient to please shareholders. Without sufficient caution, users are actively feeding Audrey II daily by responding to and sharing these applications. Repeat after me: we are the product.

To recap: a breach at Facebook did not occur, though what was, in fact, clearly a breach of contract has been known about since 2015 (the activities themselves having taken place in 2014, four years ago), when Facebook discovered that Professor Aleksandr Kogan, having utilized its application functionality to undertake psychological research – an activity which is exempt under Section 33 of the current Data Protection legislation[1] – had “naively” (his word) shared the resulting data.

Paul Grewal, Deputy General Counsel for Facebook, stated that:

“The claim that this is a data breach is completely false. Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent. People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked.”

The operation of “bad actors” was erroneously linked with this contractually agreed activity. However, it appears that Kogan acted in ignorance of his existing contractual (he had signed a Non-Disclosure Agreement [NDA] with Facebook), legal and academic obligations, which shames him as a researcher.

Your conspiracy theorist antenna should, naturally, be heightened when you understand that Professor Kogan, conducting his research through the University of Cambridge, is a Russian-American who continued to return to Russia during his tenure. The psychological test developed (entitled “thisisyourdigitallife”) – into which participants logged using their Facebook credentials, and for which they were paid to take part – resulted in data that helped both advertisers and political pundits understand a chosen electorate.

Operating the six degrees of separation approach, 270,000 participants became upwards of 50 million by association, through screen scraping of friends’ data – an activity to which those friends will not have explicitly consented.
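The amplification implied by those two figures can be sanity-checked with simple arithmetic (the per-participant friend count below is derived from the reported totals, not from any published Facebook statistic):

```python
# Rough sanity check of the friend-graph amplification described above.
participants = 270_000    # users who actually took the quiz
affected = 50_000_000     # profiles reportedly harvested by association

# Implied average number of friends scraped per participant
friends_per_participant = affected / participants
print(round(friends_per_participant))  # prints 185
```

In other words, each consenting quiz-taker only needed a modest, entirely ordinary friends list for the harvested population to balloon by two orders of magnitude.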

The Facebook application places unique identifiers in cookies. Users have control over their cookie settings and whether to accept them or not. Not accepting them usually means some lack of expected engagement or functionality. Everything in life is a trade-off – your activity online is no different. Ultimately, effective Facebook privacy controls would have stopped the screen scraping.
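As an illustrative sketch only (not Facebook’s actual mechanism), the essence of cookie-based tracking is that a site mints a random unique identifier and hands it to the browser in a single Set-Cookie header; every subsequent request then carries that identifier back, which is why declining cookies trades away some expected functionality:

```python
import uuid

def build_set_cookie_header() -> str:
    """Mint a random unique identifier and wrap it in a Set-Cookie
    header, as a tracking pixel or embedded social widget might.
    The cookie name 'visitor_id' is illustrative, not Facebook's."""
    visitor_id = uuid.uuid4().hex  # unique per browser, not per person
    return (f"Set-Cookie: visitor_id={visitor_id}; "
            f"Max-Age=31536000; Secure; HttpOnly")

header = build_set_cookie_header()
print(header)
```

A browser that accepts this header will present the same `visitor_id` on every later visit, allowing activity to be stitched together across sessions; a browser that rejects it appears as a fresh, anonymous visitor each time.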

Amongst all this is Christopher Wylie of Eunoia Technologies playing the role of whistleblower. He now feels guilty about his involvement in these data aggregation activities and has engaged positively with the authorities in order to assuage that guilt, believing that by doing so he is taking responsibility. This is another example of falsely directed virtue signalling. If Wylie were as concerned as he claims, then he should have been much better educated and read in the space. We have spent years seeking to professionalise the IT industry, and these revelations do us nothing but harm, ethically, both presently and moving forward.

The implications that SCL and/or Cambridge Analytica have been involved in foreign election tampering as a result of political data mining should not be considered particularly alarming. The wider context here is Information Warfare – not something that is new. RAND provided government research on the future implications of the impact of information warfare as far back as 1996. Denning (1994), Szafranski (1995), and Schwartau (1995) provided the backdrop to this work.

The fact that Facebook allowed the “data grabbing” application development as recently as 2014 makes its fundamental business premise in this regard illegal within the context of the consent requirements of both the UK Data Protection Act 1998 (and thus the whole of Europe’s 1995 Data Protection Directive) and the Privacy and Electronic Communications (EC Directive) Regulations 2003 [PECR], which implement the 2002 e-Privacy Directive and are soon to be replaced by the e-Privacy Regulation.

That Facebook is now considering further restrictions on application development is somewhat late after multiple horses have collectively bolted through all available half-open stable doors. Treasure troves of our collective personally identifiable information (PII) exist in many forms, in many locations, in databases, in spreadsheets, in clouds…

Nonetheless, this is NOT an unregulated space – though this is what all the media pundits would have you believe. Shame on them for not being better read, frankly. I’ve been maintaining a database of information-based legislation for over fifteen years. The world is awash with both complementary and competing flavours of privacy and data protection related laws. All of this activity is ultimately bound in contractual law. Unfortunately, our procurement colleagues are not as aware as they should be, nor are many company lawyers, as to how best to manage the legislative requirements with regard to information management.

However, contractual arrangements continue to be managed erroneously in reverse. If the UK government were to engage SCL or CA, then the UK Government should write the contract – as the Data Controller – rather than blindly accepting whatever contract was presented to them by their supplier. The Data Controller must stipulate the manner in which data is used, shared, processed, stored, retained and destroyed. In each and every case.

This is a key and pressing action for all organisations processing personal data, which need to update their procurement practices to align with the compliance expectations of the EU General Data Protection Regulation (GDPR). Records of Processing Activities (ROPA) that identify all third parties are required, along with the ability to evidence the existence of NDAs, contracts or Data Sharing Agreements (DSAs)/Data Processing Agreements (DPAs). This is no easy or quick task. The requirements apply to Facebook (and others), given the volume of EU citizens’ personal data they process.
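A minimal sketch of what a single ROPA entry might capture, assuming hypothetical field names (GDPR Article 30 prescribes the content of the record, not its format), together with the kind of evidence check described above – that every third party has a matching agreement on file:

```python
from dataclasses import dataclass

@dataclass
class RopaEntry:
    """One record of a processing activity. Field names are
    illustrative only; Article 30 mandates content, not schema."""
    activity: str
    purpose: str
    lawful_basis: str
    data_categories: list
    third_parties: list   # every recipient must be identifiable
    agreements: list      # evidence: NDAs, DSAs, DPAs per recipient
    retention: str

entry = RopaEntry(
    activity="Customer newsletter",
    purpose="Direct marketing",
    lawful_basis="Consent",
    data_categories=["name", "email address"],
    third_parties=["MailingHouse Ltd"],
    agreements=["DPA signed 2018-03-01 with MailingHouse Ltd"],
    retention="Until consent withdrawn",
)

# Evidence check: flag any third party without a recorded agreement
missing = [tp for tp in entry.third_parties
           if not any(tp in a for a in entry.agreements)]
print(missing)  # prints [] when the paperwork is complete
```

The point of the check is the one made above: it is not enough to name third parties; the organisation must be able to evidence the NDA, DSA or DPA covering each one.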

The clock is rapidly ticking to GDPR deadline date of 25th May 2018. Reach out to your friendly consultant(s) for assistance ASAP!

 

About the Author: Dr Andrea C Simmons, PhD, FBCS CITP, CISM, CISSP, M.Inst.ISP, MA, ISSA Senior Member is owner and director of www.i3grc.co.uk. Andrea has more than two decades of direct information security, assurance and governance experience, helping organisations establish appropriate controls and achieve and maintain security certifications in order to ensure information protection is adequate for their crown jewels. Her work has included the development of a trademarked and patentable enterprise governance, risk & compliance (eGRC) approach to addressing business information governance needs. Whilst also spending the last 8 years researching Information Assurance, Andrea has published two security management books. She can be reached at andrea.simmons@bcs.org.

Editor’s Note: The opinions expressed in this and other guest author articles are solely those of the contributor, and do not necessarily reflect those of Tripwire, Inc.


[1] http://www.adls.ac.uk/wp-content/uploads/2011/04/Section-33-of-the-DPA-a-practical-note-for-researchers.pdf
