
It has already passed into the obscurity of the 24-hour news cycle, but recently the U.S. President and the UK Prime Minister both made public statements advocating that the government should have a way to decrypt our data at any time. Suggestions have ranged from key escrow schemes to backdoors in the encryption software, and presumably a myriad more I haven’t heard of.

The main argument is that national security interests outweigh the privacy interests of individuals and organizations. As a software developer and security-aware person, I can’t imagine anything worse for security or privacy than these sorts of suggestions.

The arguments in favor of initiatives to weaken encryption and security protocols generally amount to making one specific group safer at the expense of the privacy and security of another. This fails, first and foremost, as an argument because it makes everyone weaker, including the group being “protected.” These arguments also fail factually: I am aware of no prosecution that has hinged on recovered encrypted data.

In fact, in most reports I read, criminal cases like that of Dread Pirate Roberts depend on old-fashioned police work, not technology skills. Additionally, these systems generally depend on the “good intentions” of the government and the people acting on its behalf, which should frighten anyone, given what we know about the corruptibility of humans and the likelihood that any given person will abuse the power they are given.

As a software developer, I see the whole conversation as a distraction from the problems this sort of behavior creates downstream. Proposals to make encryption protocols friendlier to government interests never touch on the weaknesses such mechanisms are destined to introduce into the software and systems we build.

At the most basic level, any piece of software that does meaningful work is sufficiently complex that it has behaviors the developers never intended; some we turn into “features,” others are bugs (or vulnerabilities). Over the course of my career, I have encountered many principles, like KISS, and processes, like agile, that express the fundamental tenet that we should make software only as complex as necessary to perform the job at hand.

It would seem obvious, then, that any change to a piece of encryption software that makes the encrypted data recoverable adds complexity (additional if blocks, for instance). In adding this complexity, we are likely to introduce bugs; unintentionally, of course, but it will happen. And these defects will compromise the security of the systems the encryption is used to protect.
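To make the point concrete, here is a deliberately toy sketch (all names and the trivial XOR “cipher” are invented for illustration, not any real protocol): adding escrow means adding a second decryption path and extra key-handling machinery, every piece of which must now also be correct and kept secret.

```python
# Toy XOR "cipher" standing in for a real algorithm. XOR with the key is
# its own inverse, so the same function encrypts and decrypts.
def encrypt(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

decrypt = encrypt

# The escrow scheme's new moving part: a master key held by a third party.
# (Hypothetical constant; in a real deployment this secret must never leak.)
ESCROW_KEY = b"\x13" * 16

def encrypt_with_escrow(plaintext: bytes, user_key: bytes):
    # Alongside the ciphertext, store the user's key "wrapped" under the
    # escrow key -- the additional code path a backdoor requires.
    wrapped_key = bytes(k ^ e for k, e in zip(user_key, ESCROW_KEY))
    return encrypt(plaintext, user_key), wrapped_key

def escrow_decrypt(ciphertext: bytes, wrapped_key: bytes) -> bytes:
    # The second decryption path: anyone holding ESCROW_KEY -- authorized
    # or not -- can unwrap the user's key and read the data.
    user_key = bytes(w ^ e for w, e in zip(wrapped_key, ESCROW_KEY))
    return decrypt(ciphertext, user_key)
```

Even in this tiny sketch, the escrow variant roughly doubles the code that must be right, and the wrapped key plus `ESCROW_KEY` form a single point of failure that the plain `encrypt`/`decrypt` pair never had.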

No matter how well-intentioned these efforts are, whether an escrow scheme or a software backdoor, we are likely to end up a whole lot less secure and private. The proposals always include “safeguards” or mitigations against misuse and abuse, but one thing should be clear by now: governments are groups of people, and people abuse power and make mistakes.

And given what we know about the investigations and prosecutions of cases involving recovering encrypted data, the trade-off isn’t worth it.