
Gather around the server, script kiddies, for I have a scary tale of woe. What makes this story all the more frightening is that it is true.

Years ago, late at night, I was startled awake by the buzzing and alarms of my cell phone as a series of SMS messages came through from the various monitors I had in place, warning me that a website I managed was down.

I stumbled to my home office and fired up my laptop to see what the problem was. The site was down: nothing responded via web browser or on port 80, but the server itself was still up.

My first thought was that it might be a data center issue, perhaps scheduled maintenance, but when I checked our provider's systems there were no reported problems, and according to their control panel all of our servers were operational.

The plot thickens. Then my heart sank as I thought “We’ve been hacked, the server is compromised!”

My next thought was that it was a server issue and a simple reboot might be in order. I rebooted the system, and this time got the Apache default page; at least I was getting a response. I restarted Apache, but still nothing.

Then, when I checked the Apache configuration file, I made an interesting discovery: the configuration file had been renamed. Its ownership and permissions had not changed, but its name and timestamp had been modified at around 4:30 AM.

It Came From Inside the Company!

I checked the login history, and sure enough someone had logged into the system: a contractor hired by the company to help configure systems, who had completed his work a month prior. The company had failed to remove or limit his account after the job was over, thinking he might do more work in the future.

[Screenshot: output of the last command]

Then, looking at the logs, it was simple to see what had occurred:

[Screenshot: the commands captured in the logs]

He had logged into the system, shut down the web server, and then moved the Apache configuration file so it would not load when Apache was restarted or when the system was rebooted.

Why would he do such a thing? The contractor was in a dispute with management, so to make a point he sabotaged the server. Nobody knew about the dispute except for one or two managers, none of whom knew he still had access to the servers.

The evidence was gathered, and the company not only terminated the contractor but also considered prosecution: there was more than enough evidence to show the sabotage was willful, and additional emails from the contractor showed clear intent and motive. At the time, the service he had brought down was being used by law enforcement, so he had unknowingly interfered with a police investigation, which could have led to additional charges.

The moral of this story: keep track of who has access to your systems, limit contractors' access, log what is being accessed, and detect when there are significant changes to system configurations.
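The core of that last point, detecting configuration changes, is file-integrity monitoring: record a baseline of hashes for critical files, then compare later snapshots against it. Here is a minimal, illustrative sketch of the idea; the function names and structure are my own, not Tripwire's product or API:

```python
import hashlib
import os

def snapshot(paths):
    """Record the SHA-256 hash and mtime of each monitored file."""
    state = {}
    for path in paths:
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        state[path] = {"sha256": digest, "mtime": os.path.getmtime(path)}
    return state

def diff(baseline, current):
    """Report files that changed, vanished, or appeared since the baseline."""
    changes = []
    for path, info in baseline.items():
        if path not in current:
            changes.append(("missing", path))
        elif current[path]["sha256"] != info["sha256"]:
            changes.append(("modified", path))
    for path in current:
        if path not in baseline:
            changes.append(("added", path))
    return changes
```

A renamed Apache configuration file, as in this story, would show up as one file "missing" and a new one "added" on the next comparison, which is exactly the kind of alert that should have fired at 4:30 AM.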

Particularly when dealing with disgruntled employees and contractors, HR and IT departments need to work together to initiate stronger measures that mitigate the risks posed by people with privileged access.

Looking back now, I realize how much Tripwire Enterprise would have helped in this case. It provides several controls, as well as forensic tools, that would have made a difference: flagging configuration changes made at odd times (4:30 AM), quickly identifying which user account made the changes (even if they were made under sudo), and flagging a user who is considered high risk.

This company only had a handful of servers at the time, but if an organization has to manage hundreds or thousands of servers and thousands of employees, you begin to see how Tripwire not only mitigates risk but also saves organizations time.

The scary thing is that the company chose not to prosecute the IT contractor, and he is still out there somewhere… BEWARE!

 


P.S. Have you met John Powers, supernatural CISO?