
The “good news” is that, unlike vulnerabilities in GnuPG and similar strong-encryption software, this attack was just a classic brute-force compromise of the web hosting platform and has nothing to do with the widely used OpenSSL software itself (a significant percentage of websites rely on OpenSSL for browser security).

The article mentions that the checksums of the OpenSSL software, hosted independently on GitHub, appear unaffected. It also mentions, briefly, how the attack was effected: bad passwords.
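This is exactly why checking a download against an independently published checksum matters. A minimal sketch of that check, in Python (the file name and expected digest here are stand-ins, not values from the article):

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Compute the SHA-256 digest of a file, reading in chunks
    so that even a large tarball doesn't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_hex):
    """Return True if the file's digest matches the checksum
    published on an independent channel (e.g. the project's GitHub)."""
    return sha256_of(path) == expected_hex.strip().lower()
```

If the digest computed locally differs from the one published out-of-band, the download should be discarded, regardless of what the (possibly compromised) website claims.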

And that’s really the bad news. For all the cleverness of the mathematicians, software engineers, and hackers who put together first-rate software, their website (hosted on a second-rate, discount provider in Sweden) was compromised either through poor-quality passwords and inadequate password-change policies on the part of the OpenSSL team or, at least as likely, through loose security and the same lack of good policy and procedure on the part of the web-service provider.

The most likely scenario is that an ex-employee or contributor to the project leaked access codes, carried out the break-in personally, or simply didn’t guard his or her information carefully enough. To this day I can still remember the root / administrator (privileged user) account password for *all* the Unix, Linux, and Windows servers that Loudcloud had deployed for *all* its customers around the world.

It wasn’t a bad password — more than 8 characters long, upper and lower case, numbers and symbols, no repeated characters — but it was also highly memorable (especially after typing it a few dozen times per day!)… and it was never changed.
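The criteria just described could be sketched as a simple policy check. This is a hypothetical illustration of those rules, not a tool from the article (and, as the anecdote shows, a password that passes every such check is still a liability if it is never rotated):

```python
import string

def meets_policy(pw):
    """Check a password against the criteria described above:
    more than 8 characters, upper and lower case, digits,
    symbols, and no repeated characters."""
    if len(pw) <= 8:
        return False
    has_classes = (
        any(c.isupper() for c in pw)
        and any(c.islower() for c in pw)
        and any(c.isdigit() for c in pw)
        and any(c in string.punctuation for c in pw)
    )
    if not has_classes:
        return False
    # "No repeated characters": every character appears only once.
    return len(set(pw)) == len(pw)
```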

That was 13 years ago. Of course, Loudcloud is no more and all the assets have changed hands several times… but at times like this I wonder if I could send myself a pair of shoes from the Nike website or improve my stock portfolio through one of the management companies we used to host.

I’m oversimplifying greatly — there are still plenty of security hurdles to clear before one even has an opportunity to use a secret password or other privileged information — but this attack is yet another demonstration not only of the possibility of compromise but of the fact that the solution lies not in technological complexity but in the human factor.

Far more “nerve rattling”, IMHO, is the author’s conclusion about being “hacked”: “Users should demand a thorough autopsy. And while they’re at it, they should demand that the official maintainers of both the PHP Web scripting language and the Linux operating system kernel make good on promises to provide autopsies of serious compromises on their own servers.”


First, a logical problem: a project’s website usually has nothing to do with the software that the project provides, even if one can download the software “from” the site (which is typically a redirect to a different site on a different system with different security). Confusing the two contributes to a poor, fear-based understanding of what the issues at hand actually are.

Now, I grant that a “hack” enabled by bad passwords on a cut-rate service provider is pretty embarrassing for any such project. *BUT* the purpose of Free and Open Source Software (FOSS), its very reason for existence, is for the *user* of the software to be able to find and fix bugs, rewrite the software to suit one’s own purposes, contribute back to the community, or re-sell the software as part of a larger package.

In short, the onus of responsibility for the use and quality of the software is on the *user*! This is the direct antithesis of the classic software-company model, in which the user is legally *denied* the right to take responsibility for the software and must rely on the vendor for fixes and improvements — which may or may not be at all what the user desires but is paying for in any case.

The middle ground also exists: there are companies that write or customize [FOSS] software and fix critical bugs according to the desires of their customers. Of course, this middle ground can *only* exist in the FOSS environment, since commercial software cannot (legally) be modified.

Many independent studies have found that, generally speaking, FOSS tends to be of higher quality than commercial software: it performs more efficiently and is more extensible. Not to mention that FOSS can always be highly customized by a sufficiently motivated user.

That said, commercial software, especially from virtual monopolies such as Microsoft, Apple, and, increasingly, Google and other online / cloud providers, has one significant (in the eyes of many users) benefit: it tends to be much more consistent between products, generally reducing the learning curve between *how* one interacts with different pieces of software on the same system.

(Microsoft, oddly, tends to scuttle this benefit by introducing significant and often-unnecessary changes between major releases… though this just seems to be proof that their customer base is composed largely of victims of Stockholm syndrome.)

This benefit, of course, is also its own limitation; all of the vendor’s software *has* to work “the same” (from a user management perspective) or the user will perceive it as being broken. With FOSS, on the other hand, a higher learning curve between software packages on a single system also allows for greater flexibility on both the part of the designers and for the implementers / users.

The problem becoming evident in both arenas is that, in the general desire to make increasingly large and complex software “easy to use”, most software packages come riddled with security holes: not flaws in the design exactly, but simple configuration problems and un[dis]closed “back doors”, because the royal We have been tricked into believing that, because the software was easy to install or get running, its functioning is also easily and completely understood.

This *is* the fault of the user, regardless of what software s/he is using. But, too often, the user doesn’t want or can’t afford to take responsibility.

In any case, the developers of and contributors to FOSS are rarely paid directly for their [coding] efforts (many of them do make good livings via consulting, publishing definitive books on the use of their software, or being employed by a company that encourages or supports FOSS projects).

Although the FOSS community, including the original developers of any given project, is often highly responsive to problems, concerns, and bug reports, the simple fact is that *all* FOSS is released with a legal caveat along the lines that the software carries no guarantee of merchantability or fitness for any particular purpose.

It has to be that way. The end-user has to take the responsibility. The tendency to actively deny this responsibility, and to encourage others to do the same, as expressed by the author of the article, is at least as frightening as the bad password procedures that gave him something to write about in the first place.


About the Author: Walter Davies is a Unix Systems Engineer, musician, and scifi buff with 20 years of high-tech industry experience. He has survived for the past 8 of those years without a cat.


Editor’s Note: The opinions expressed in this and other guest author articles are solely those of the contributor, and do not necessarily reflect those of Tripwire, Inc.



