In this episode, Lisa Forte, partner at Red Goat Cybersecurity, discusses what happens when organizations are unprepared for an inevitable cyber breach. She explains why practicing your breach plan is the best line of defense.
Every cybersecurity practitioner dreads the moment when they realize that a breach has actually occurred in their organization, their data has been copied or encrypted, or they receive a phone call from a third party notifying them that their data has been found for sale somewhere online. Fortunately, there are companies that can help to lessen these horrible occurrences. Lisa Forte is a partner at Red Goat Cybersecurity, which specializes in handling the moment that most of us dread and try to avoid.
Tim Erlin: Lisa, tell us about some of what you and the other members of Red Goat can do to help a business not only to prevent but to also recover from such dire situations.
Lisa Forte: Yes, this is my area of expertise. I have seen C-Level executives panic and bury their heads in their hands as I run scenarios for them to test their plans and their teamwork during a crisis.
TE: What does an incident such as a breach look like for an unprepared organization? What's likely the first thing that happens with that type of an organization when a breach occurs?
LF: I've actually found that the more unprepared an organization is, the more tempted they are to lie or try and conceal what's happening. It's really interesting; if you run a scenario with a company that's done no preparation whatsoever, they have an immediate knee-jerk reaction. For example, an unprepared company may just say that the website was down, the data hasn't actually been leaked, or nothing has happened. I would advise such a company that, strategically, they may not want to take that approach. A company may respond that way simply because there is no plan for what on earth to do in that situation. So, because of that, they decide the best thing to do is just deny it.
TE: That is really interesting. As practitioners in the information security industry, we understand that every organization is potentially a victim no matter how well-prepared they are. There's still the potential that a really well-resourced adversary is going to manage to compromise them. But if an organization is unprepared, that idea that the first instinct is a defensive one, to lay blame somewhere else, is an interesting psychological approach.
LF: It is a bit like when we were children. When I was a child, if I broke an ornament or something of my mother's whilst playing, I would tend to try and put it back in such a way so it looked like it could have been my sister’s fault. That's where we still are as adults when we're not prepared. We think, “I'm going to be in so much trouble” and “This is so bad. We don't have a plan. How are we going to deal with it? Let's just try and blame someone else or blame something else and get out of it.” That psychology has kind of carried over to our adult lives on some level.
TE: But in this case, there is someone else to blame. In most of these cases, there is actually an attacker who has malicious intent. It's interesting that with the existence of someone who you could actually blame, who has likely committed a criminal act, we still often choose to sort of try and deflect blame from ourselves onto something that isn't that attacker.
LF: If you're an organization that has absolutely no semblance whatsoever of a plan of what to do, the chances of you having really robust cybersecurity defenses or thought-out asset management plans and risk assessments are probably pretty low. Generally speaking, if you're really that unprepared, you probably haven't done what you should have done to protect your data in the first place. So, you're probably also aware of that fact, and that might come knocking at your door.
TE: That's certainly true. The first thing that happens is that instinct to lay blame somewhere else and to hide the incident. Is that the first mistake that you see unprepared organizations make, or do they generally avoid that particular mistake and make some other first mistake?
LF: It can go either way. Sometimes, they go for it, and they sort of say that nothing's happened. We can all think of cases in the last couple of years where we've seen big organizations apply that approach. What I tend to find is that the thing that is really obvious to the public is that the communications are really terrible. The company either doesn't put out any communications at all about what's going on and how they're trying to fix the situation, or the communications are really muddled. Another thing you'll see happen with an unprepared organization, for example in a ransomware attack, is they'll come out immediately and say that they are not paying the ransom. That's always a red flag because it highlights that they probably aren't considering it from a business perspective. On a moral level, we all deeply feel that this is not something you should do. However, it is a legitimate business consideration. Often, you then find they paid the ransom and have to backtrack, which also looks really terrible. Communications aren't something you can just make up ad hoc on the fly. You have to really think them through.
TE: In the case where an organization says that they are not going to pay the ransom and then end up paying it, how much do you think that the public realizes that the payment occurred? I find myself wondering if that initial statement of not paying the ransom is what gets remembered and reported and then the eventual payment of the ransom is sort of a secondary event that doesn't really get any publicity.
LF: It depends really on the situation. The cybersecurity and the infosec communities are pretty tuned-in, and it becomes pretty obvious that someone's probably paid the ransom when suddenly everything is restored very quickly. I don't think it's something you can necessarily hide, but again, it goes to indicate that an organization didn't have a great plan. One of the things I've been doing a lot this year is writing ransomware policies for companies. I include scenarios that contemplate the circumstances in which a company may need to pay the ransom, so that the Board can pre-approve that sort of decision. That's the sign of being prepared; it's very, very rare that I see a well-prepared company declare that it will absolutely never pay a ransom.
TE: Let's shift a little bit from the first mistake that organizations make when they're unprepared. What's generally the biggest mistake that you see unprepared organizations make?
LF: Unless your company is particularly unlucky or particularly incompetent, you're probably not handling a crisis every day. It's probably pretty rare as an occurrence. What that means is that the people who are on your crisis management team have very little practice in running a crisis and handling pressure, making decisions, as well as delegating and working as a team. If you haven't planned and run exercises, what you end up finding is that there is no real leader in the crisis management team. It's chaos because no one owns responsibilities. The middle of a cyber-attack is the wrong time to realize that.
TE: How many organizations really actually have a crisis management team to start with? Larger organizations may have the luxury of those resources or at least some framework, but smaller and mid-sized organizations might not have any such team.
LF: With business continuity managers, often in very large organizations, you'll have an individual who has that role, but in smaller organizations, the chances are that if the role exists at all, it's probably lumped in with the CSO's role or the head of IT's. They've got a lot of other things that they have to do, so it's a tiny part of their actual role.
TE: Not an area of focus for sure. Exactly. That gives us a little bit of a picture of the unprepared organization and what might happen. You described that initial response, which is crisis-driven, and the biggest mistake, which is really a lack of planning. But what's important for a lot of organizations in terms of how to deal with a breach, or an incident, are the outcomes. What are the preventable outcomes that occur when an organization is unprepared? How does that lack of preparation impact them in a meaningful way?
LF: I think one of the most meaningful ways that you see is that the decision-making of the crisis management team is hampered to such a large extent when there is no planning and no rehearsal. What ends up happening is they either make really bad decisions, which is obviously not good, or even worse than that, they make no decisions at all, and they're almost completely paralyzed. They don't know where to get information. They're not well-versed on, for example, if their service is down, what that impact is from a customer-facing perspective, and how to build some resilience and some redundancy into that process. So, because they have no familiarity with how anything works and how to make decisions, what you find is that no decisions get made, and it's not the sort of incident where you can do that. You know, it's a fast-moving incident. It's an evolving incident. It really relies on people who are confident decision-makers, who can keep the organization moving forward.
TE: All of that indecision and inaction ends up prolonging the breach, causing potential loss of data and loss of customers. It all comes back to that incident ultimately costing that business more money than it needed to. Is that a reasonable way to think about it?
LF: Yeah, definitely. I talk about this a lot when I talk about what the losses are from cyber-attacks and from breaches. One thing that you can almost certainly depend on is recovery costs, and they can be substantial. Smaller businesses will often underestimate this. Businesses also won't necessarily have forensic companies or lawyers already on-boarded. So they're then forced to hire these resources under extreme pressure in a crisis, where they just don't have the negotiating power to drive the price down. A secondary event that flows from the breach later on is the possibility of class-action lawsuits from consumers, which again can amount to quite a large sum of money.
TE: Yes, that is a good point, that there's also that aspect of creating liability through poor decision-making or a lack of decision-making, which turns into that kind of a lawsuit and ultimately carries significant cost.
LF: This is exactly also why seeking legal counsel early on is really, really valuable – especially when you're having that legal advice during an incident that might be subject to legal privilege, which means it's not necessarily discoverable in a legal sense if a class action lawsuit ensues. If you're getting advice from a lawyer on what you should do, it is likely to be deemed “privileged information.” So people won't be able to see it. Whereas, if you're not working with a lawyer early on, all of that information is potentially subject to discovery.
TE: The other cost that people don't necessarily think about is the opportunity cost of the breach. All of the money that is spent to resolve the breach is really unbudgeted funding. It's not money you planned on spending, because nobody plans to spend money on a breach. That means it's coming out of somewhere else, such as the budget to grow your business.
LF: Definitely. Organizations haven't necessarily budgeted for it, nor have they thought it through. For example, with ransomware, people don't necessarily realize that you can't just set up a bitcoin wallet, buy bitcoin, and pay the ransom in two seconds. That doesn't happen. Something like that would probably need Board approval. It's going to take time to set up the bitcoin wallet. Many organizations wouldn't even know that. And the longer the payment is delayed, the more likely it is that the attackers will raise the price.
TE: I might start asking people what their budget for ransomware payments is and see what kind of a response they offer.
LF: Many organizations would not risk holding such a volatile asset.
TE: That's right. Yet, in survey after survey, ransomware shows up as the top concern that practitioners have today. It would be reasonable, given that it's a top concern, for organizations to actually set aside budget for paying ransom, but I'm pretty sure they don't. It's an interesting little contradiction.
Let’s move on from that unprepared organization to the well-prepared organization. Although you specialize in helping organizations prepare, that means you probably encounter more unprepared folks than prepared folks, or maybe you leave them better prepared than when you arrived. What are the key differences that you see when you encounter a well-prepared organization versus that unprepared scenario?
LF: There are two categories of well-prepared organizations. There are actually well-prepared organizations and organizations that think they are well-prepared. The two are not the same. The people who think they're well-prepared usually have absolutely beautiful plans and playbooks for everything. It looks great on paper. Absolutely perfect. However, when you run an exercise, you realize very, very quickly that the plans are usually overcomplicated. People don't understand them. They're unusable. The plans make presumptions that simply aren't based in reality. In practice, the plans fall apart. The truly well-prepared organizations are the ones that have run an exercise, improved their plans, run another exercise, and improved their plans further. It's that ongoing process that leaves them in a position where their crisis management team is super confident. The plans are really robust, and they know exactly where their weaknesses are.
TE: This seems like an important distinction, that being well-prepared doesn't simply mean having a response plan. It actually is a process of continually exercising that plan and continually improving it as you go through those exercises.
LF: Yeah, totally. Because often – You know, I'm guilty of this, and I'm sure everyone is guilty of this – you will write something down that you think is absolutely genius, but when you try to execute it or bring it into practice, you realize that certain bits of it work academically but aren't practical. That's why you have to exercise the plan and test the scenarios.
TE: We create things for ourselves that might work with the assumption that we're the one using it, but everyone is different. So, having a plan that actually anyone in the organization as part of that crisis management team could effectively execute is actually a really challenging task.
LF: It is. The other thing that's perhaps even more challenging is that a lot of organizations have good inter-departmental representation in their crisis management practice, but the team leadership is not as well thought out. The default in a lot of organizations is to allocate the CEO to that position, but that's purely hierarchical. Actually, whoever leads the crisis team needs to be the individual most capable under pressure. That may well be someone other than the CEO. It depends on individual characteristics, which you won't know until you've run exercises.
TE: That's also an interesting point, that those exercises might change the plan itself, but they might also change the roles that people play based on the results of those exercises. Often, when we think about job roles, we think about them as being fairly immutable, but finding the best person to fit that lead role in a crisis management team is different than a job assignment.
LF: Yeah, there are some people who are absolutely excellent at their day-to-day jobs. Absolutely superb, but they simply can't handle a crisis. You don't want to discover that in the middle of an absolutely crucial cyber-attack or data breach.
TE: Yeah. That seems like a really tough scenario to play out because no matter how realistic you make the tabletop exercise, it's never going to have the same level of pressure as a real incident. Do you have any tips for how to resolve that, how to simulate the kind of pressure that you might experience in a real incident?
LF: I always say that the purpose of these exercises is to build confidence. You want to build confidence in your crisis management team. It's always good just to start off with a tabletop exercise with a gentle scenario. As they get more used to running exercises, you can build them up. As an example, on one of the exercises I conducted a few years ago, we had fire, ambulance, and police involvement in it. It was in a shipping port. So, we had simulated bodies with blood everywhere, and we had drones, and we hired a load of people to play journalists. We actually walked the CEO out of the building and had him accosted by journalists who were trying to ask him questions. Obviously, if your crisis team has never run an exercise before, and you start at such a high level, you are probably going to get an incredibly bad response. So, it's about building the confidence of those people and then introducing more immersive elements into the exercise.
A serious simulation is a great awareness-raising tool for your staff generally, as well, because they can see that we're playing this game where we're pretending we're under attack and this is what we're doing. It really helps you reinforce some of those security awareness training lessons that you've implemented in your general population of staff, as well.
TE: How do the outcomes differ when you have a well-prepared organization?
LF: As an example, we can look at the Maersk shipping company ransomware incident. What was really interesting was that after their ransomware attack, instead of the media jumping on them and accusing them of being unprepared, their IT teams were labeled heroes by the media, which is just unheard of from a cyber-attack perspective. Maersk’s teams did two really interesting things, and they did them absolutely excellently. The first one was that they communicated completely openly and transparently to the public. The second one, which is actually more interesting, was they turned to their staff and they said, "You do whatever you think is right to serve our customers, and we'll pick up the bill later. Just make sure our customers are happy." That delegation meant that the senior people on the team didn't have to keep getting questioned about how to respond to the attack. They just went and did it.
TE: Yes, that is interesting, and I'm sure there were mistakes that were made in that process for Maersk, but those are overshadowed by the positive performance. If they had worried about those individual mistakes, they would have ended up with a much worse outcome, ultimately.
LF: Totally. The chairman of Maersk actually said that he thought that they were probably average at best at cybersecurity and that they had learnt a lot of lessons from the cyber-attack. I just think that honesty really resonated with the public. It just seemed really refreshing.
TE: It's also interesting because we started this conversation discussing how an organization that hasn't done the best job of cybersecurity in general is more likely to want to lay blame somewhere else or to hide the fact that there was a breach. Yet, for Maersk, if they said they were probably average, maybe that's enough. There has to be a threshold at which that level of transparency and honesty is acceptable, for if they were incredibly poor at cybersecurity, I think the outcome might have been different.
LF: I see your point. It is especially important for an organization to control the narrative so that the public doesn’t come up with their own theories. What happens then is that the organization has to put out fires by correcting the false stories. This is why we write our communication templates before the incident happens as opposed to when we're in an emergent situation.
TE: The takeaway here is that the biggest difference between an unprepared and a well-prepared organization is doing that planning. But not just having the plan. Actually running the exercises. If you're not doing that, then you're not really prepared.
LF: That sums it up perfectly.
TE: Lisa, I want to thank you for the time. It was a really interesting conversation. I appreciate it.
LF: Thank you so much.