Cybersecurity

Can Reducing Risk Be Bad for Us? Yes, If Your Rules Lack Flexibility

Charlie Platt, Director of Data Analytics for iDS, resumes his Ethical Hacker column with a piece on how reducing cyber risk can get in the way of your business – that is, if your rules lack flexibility. The best way to implement successful cyber risk programs without hurting your business, he says, is to design them to adapt to dynamic business requirements by providing an approved exception process.

It’s been a while since I’ve been on these pages. I’ve missed it and it’s good to be back. One big change is that I’m now also focused on data analytics in addition to cybersecurity. I will be heading up the Data Analytics practice at iDS and Robert Kirtley is heading up the Cybersecurity practice. Together we will be talking about how data analytics and cybersecurity go hand in hand, and how we can assist each other in achieving great results for our clients.

In light of that new focus, I’d like to tell you about a project I recently worked on for a client. While on the surface our work was focused on data, there were strong undercurrents of cybersecurity throughout the project. We were engaged to assist the client in extracting critical data from a cloud provider system. Arguably, that’s a data problem. But the client needed iDS because their original contract with the cloud provider had very little language about how to end the relationship and get their data out. That’s a cyber problem, and one that bears further discussion – how to avoid or mitigate vendor lock-in could be an entire column itself. But that’s not what I want to talk about here. I want to discuss how your cyber program can inadvertently get in the way of your business.

A cyber program that can’t adapt to dynamic business requirements by providing an approved exception process forces the business to take on unacceptable risks.

The client in this instance has a very mature cyber program. Employees are focused on cyber and alert to the risks, clear and consistent messaging is prevalent throughout the workplace, and a culture of security is evident. In almost every way the program is enviable and something we should all be striving for, with one exception – it lacks flexibility and has difficulty adapting to new or unusual circumstances. In short, it’s a great program but it has no means for exceptions to the rules.

At this point most of us are probably thinking, isn’t that a good thing? We shouldn’t have exceptions to security rules. And for the most part that is correct, but exceptions do play an important role in security. It’s not that we can’t or shouldn’t have them; it’s that we need to manage them and not make exceptions the default state. Take, for example, patch management. We all agree that regular patching is important, and that a key part of security is keeping our systems properly updated with the most recent security patches. But that also runs contrary to the security concept of planning and testing prior to introducing changes into production environments. And so, when a patch comes out for a zero-day vulnerability where an exploit is known to be in the wild, we go ahead and patch as fast as possible. In this instance the risk of not patching exceeds the risk of making changes without all of the testing and review being completed.
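The patch-management trade-off above can be sketched as a simple risk comparison. The scoring scale, weights, and threshold below are illustrative assumptions for this sketch, not a standard methodology:

```python
from dataclasses import dataclass

@dataclass
class PatchDecision:
    """Illustrative risk comparison for an emergency-patch exception.

    All scores are hypothetical 0-10 ratings an organization might
    assign; they are assumptions for this sketch, not a standard scale.
    """
    exploit_in_wild: bool  # is a working exploit publicly known?
    severity: int          # impact if the vulnerability is exploited (0-10)
    testing_risk: int      # risk of deploying without full testing (0-10)

    def risk_of_waiting(self) -> int:
        # An active exploit sharply raises the cost of delay
        # (the doubling factor is an assumed weight).
        return self.severity * (2 if self.exploit_in_wild else 1)

    def patch_now(self) -> bool:
        # Grant the exception when the risk of not patching
        # exceeds the risk of an untested change.
        return self.risk_of_waiting() > self.testing_risk

# A zero-day with a known exploit: the exception applies, patch immediately.
zero_day = PatchDecision(exploit_in_wild=True, severity=8, testing_risk=6)

# A routine patch: follow the normal test-then-deploy process.
routine = PatchDecision(exploit_in_wild=False, severity=4, testing_risk=6)
```

The point is not the particular numbers but the structure: the exception is a managed, documented comparison of risks, not a blanket override of policy.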

Which brings me to my recent engagement. The project had several real-world constraints that gave us a tight window for success. We did what we could to mitigate these, but for the most part we simply had to accept them and work within them. On top of this, the client’s cybersecurity policies added further constraints, such as requiring a specific VM with only two remote sessions, which severely limited the number of users who could work at any one time, and requiring all work to be done on site. These are all sound cyber policies, and we were working with sensitive data, but they caused costs to rise dramatically and introduced a real risk that we might not complete the project by the drop-dead date.

In this instance the client’s problem had more than one solution, and it made sense to pursue several at once to reduce the risk that any single one would fail. Unfortunately, the alternative solutions ran into cyber policy roadblocks, mostly waiting for approvals and resources, which added cost and further risk of failure. Ultimately, the client’s cyber program more than doubled the cost of the project and introduced significant risk of failure. The cyber concerns involved were real and should not be disregarded, but they could have been addressed through an exception handling process rather than rigid, inflexible policies.

Cyber should be a servant of the business, not the other way around. A cyber program that can’t adapt to dynamic business requirements by providing an approved exception process forces the business to take on unacceptable risks. Our cyber programs need to weigh business risk as well as cyber risk when developing policies, and allow for increased cyber risk when doing so yields a significant reduction in business risk. iDS can help you with both sides – business risk related to your data as well as cyber risk and cyber program development.
