Technology

The Cyberthreat Within: As companies wake up to insider threats, C-suite executives are looking beyond IT for help

Two of the most insidious myths about cybersecurity are that most threats originate outside an organization – Russian hackers, for example – and that it is an IT problem. Amie Taal, a cybersecurity expert with Deutsche Bank, Jenny Le, who runs operations for e-discovery provider FRONTEO, and James A. Sherer, chair of BakerHostetler’s information governance practice, are determined to dispel those myths. Below, they discuss their recent white paper focused on insider threats and the cyber responsibilities of C-suite executives (“Increased C-Suite Recognition of Insider Threats Through Modern Technological and Strategic Mechanisms”). Their remarks have been edited for length and style.

MCC: There was a day when corporate executives could walk the factory floor to keep an eye out for people and process problems. In today’s knowledge economy, executives have limited visibility into employees’ day-to-day processes and tools. At the same time, government regulators and others are putting insider cybersecurity threats squarely on their radar. That seems like an impossible situation. How should C-suite executives deal with it?

Sherer: Let’s start with the metaphor of the factory floor. What can we do to recreate that kind of environment when we’re not manufacturing something tangible? What can we do to walk a virtual factory floor? How can you use big-data collection and analytics to tell you when something’s going awry or to recognize that things are going positively? Does it mean that we set up our metrics wrong in the first place? Does it mean that something bad is happening that’s otherwise hidden? Or does it mean we should speak to the rest of the organization about something that’s going right and what we learn from it? And how can we use new technologies to improve our overall processes?

Taal: One of the biggest issues is responsibility. Until 2011, most stakeholders in an organization believed anything to do with hacking, attacks or insider threats was an IT responsibility. It wasn’t something the C-suite needed to get involved in. A large U.S. survey in 2011 asked who has responsibility, and 80 percent said it was IT, with only 43 percent saying the C-suite has some responsibility.

Fast-forward to 2016. Things have changed. The wake-up call was in 2015, when the FCC fined AT&T $25 million over an incident in a number of their call centers in which three employees stole and sold 83,000 customer records. In addition to the fine, AT&T had to appoint a senior compliance manager, contact all affected customers, put controls in place, implement an information security program, train employees on privacy policies, and file regular compliance reports with the FCC. These measures to protect customers' private data are governance and controls that should have been in place from the start.

I am seeing a difference now. C-suite executives are engaging with stakeholders to protect the organization not just from insider threats but also from other attacks. Regulators in the U.S. and EU have issued guidance and disclosure obligations. You have the EU General Data Protection Regulation (GDPR) coming into effect in 2018, under which fines can reach 4 percent of an organization's annual global turnover. That's for breaches of EU citizens' PII (personally identifiable information), not small change for any organization. We are going to see a lot more interest and involvement from the C-suite.

Le: It’s both an awareness and interest issue, and a growing accountability issue as well. The Global State of Information Security Survey measures how involved the C-suite is in these issues. Results from 2015 showed that only 38 percent of CEOs promoted cybersecurity as a corporate governance imperative rather than simply an IT issue. An even smaller percentage understood their organization’s information security technologies. The C-suite’s focus on bringing together teams, sharing information and promoting proper communication is key to tackling these challenges. We’re only going to see this focus increase with the growing regulatory pressures and responsibilities.

MCC: As the pressure and responsibility ratchet up, executives have to balance their usual business responsibilities with their special cybersecurity and insider threat responsibilities. Your white paper suggests that those roles can be in conflict. Explain this tension and how executives can deal with it.

Taal: There’s definitely a conflict. When you look at the impact of the insider threat, it affects the bottom line and an organization’s reputation. That’s where the interest lies. Executives are making it a priority and trying to balance that with all of their other responsibilities. The difference is that it’s no longer just an IT responsibility, nor is it a C-suite responsibility alone. The responsibility lies with all stakeholders within an organization, with the C-suite having visibility into what’s being done to protect the organization rather than treating it as yet another responsibility they take on themselves. It’s about getting buy-in from the C-suite and all other stakeholders in a collaborative effort to support the C-suite.

Le: The more interest the C-suite has, the more commitment you’re going to get from other groups. That promotes communication and a collaborative environment. Putting together a program to address insider threats, or cybersecurity generally, requires diverse groups within an organization to come together and focus on a common goal. Sometimes that means not only learning new tools and technologies but also considering new applications for existing tools. That’s difficult because it is not their traditional focus. Commitment from the C-suite is going to make a huge difference.

Sherer: The threat profile is high enough that C-level executives understand it is important on its own, and it can be important to their careers. That’s incentivizing more active effort. Executives are no longer saying, “I don’t need to know about it because it’s highly technical.” Instead, they’re saying, “This is very important to our business.” There may be a better appreciation that it’s time to get trained, to understand the basics, and to say, “I get this. This is important to the organization. I’m going to lead. I’ve got the right people to whom I can delegate this as an important strategic matter, rather than just reacting.”

Taal: Executives don’t really need to know the nitty-gritty of how things are done on a day-to-day basis. They need to understand the impact. People still believe that cyberhacking is about outsiders rather than insiders, but 80 percent of incidents are carried out by insiders during work hours. There’s a trust element within the organization where executives are simply hoping things don’t happen. Things do happen, so education and training are paramount.

MCC: The technologies available to workers to increase productivity can surface data that drives better corporate decision-making. But those benefits carry some real legal and business risks. Give us some sense of what that environment feels like for the C-suite executive. How do they strike an appropriate balance?

Sherer: There’s some very good marketing that frames big data as the solution to every problem. It’s pretty close to magic, especially when the marketing incorporates brand-new words and spoon-feeds the solutions to the audience. Like anything else, it can be very difficult to parse exactly how data analytics work and what they offer. Understanding what you’re ultimately seeing and how that’s derived are both very important. Users should also consider that big data analytics carry some risks and related questions. How good is your data? Are there problems that could lead to fundamental misapprehensions? How good is your data governance?

Users are creating new sets of important and sensitive information that can improve decision-making but also present their own challenges. How do users manage and protect data sets? The process may create a set of information defined as “the crown jewels.” That’s one more thing to protect. How do you use that data in ways that aren’t going to infringe on civil liberties or exert undue influence? Much of it is issue-spotting and asking why. Users are looking for a level of comfort to say, “We’ve done our due diligence. We can use this appropriately. We can protect it now that we know what it is, what it comprises, what it means to us, and what it would mean if we lost control of it.”

Le: The challenge is understanding what you’re looking at and making sure that the reporting you generate based on these data sets answers the questions you’re asking. The use of artificial intelligence and advanced analytics can quickly make sense of large data sets, but misapplying the technology can create more work than if you hadn’t used it. That data can end up leading you down a rabbit hole. It’s important to understand the technology you’re using and to make sure you have the support to properly read that data. We offer a host of technology solutions for big data management and analysis, but our focus is on helping our clients understand what tools to use when, and how to use those tools most effectively. We have helped many of our clients implement processes using artificial intelligence tools to be able to detect and report cartel activity so as to satisfy the leniency application requirements at the Department of Justice. In those projects, we are not just implementing technology but also working with several internal teams (IT, compliance, legal, etc.) to help ensure everyone understands how to use and interpret the data that is both captured and generated.

Taal: In addition to the technology, you also need what we call a “behavioral analytics plan” to know how you’re going to find what you’re looking for and how you can support what you’re looking to do, since you have employees who are going to use the technology on a day-to-day basis. How do you store that data? How do you protect it? Can you use that data for the particular thing you’re using it for, especially if it was acquired for something else? There’s a lot to consider. You have the technology, you have the data, and it can be used to get some answers and results as long as it’s reliable, you support it, and you have a plan and controls in place to work with the data for the purpose at hand.

MCC: Awareness of insider threats seems to be growing, but we’re not really there yet. Most of what we read is about outside threats such as Russian and Chinese hackers, but most studies show that more threats come from the inside. And they’re not all of one piece. How does the nature of the insider threat impact an organization’s planning and response?

Sherer: Let’s start with the people whom perhaps you’re not thinking of. It’s easy to conjure up visions of malicious insiders – someone in your organization trying to send data to a foreign power or sell it to your competitors. They’re certainly a threat that needs to be dealt with in a strong fashion, but that malicious insider is the exception to the rule. Most people aren’t intentionally trying to take down the company from within. But what we do have are people who are dealing with a flood of information and the advent of new technologies, and they sometimes simply don’t know how things work. These gaps in understanding can lead to gaps in process, which lead to potential vulnerabilities. These people may inadvertently do something that exposes data, or an outsider may exploit the processes or the people. The BYOD (bring your own device) movement is a good example of this challenge.

Taal: In any organization, you have the intentional and the unintentional. The majority of acts are unintentional. Part of it is a lack of training. Part of it is a lack of ownership and of taking responsibility as a stakeholder in an organization. You also have contractors, third-party service providers, even ex-employees with continuing access because accounts and passwords haven’t been closed down and controls are not in place. We also need to worry about social media. Organizations are putting controls in place because they see employees going on Facebook or Twitter and writing things that should stay within the organization. Those are the unintentional ones. You also have individuals who are disgruntled, and they just want to cause a denial of service, bring the system down, or wreak havoc by installing malicious software. And there are people who abuse their permissions, accessing unauthorized areas. You have to look into all of that and make sure that whatever you put in place, you’re being mindful of intentional and unintentional acts. It goes back to responsibility. Whose responsibility is it? Who should be involved in protecting the organization against these types of insider threats?

MCC: How important is it for organizations to understand where they stand regarding the various benchmarking standards? Which are most helpful?

Sherer: When we’re benchmarking with clients, we look at a number of different avenues. It’s not so much the ISO standards (published by the International Organization for Standardization and the International Electrotechnical Commission), though they can certainly come into play. We’re providing clients with a sound method they can use to spot issues in their environment. We benchmark against prior organizational experience as well as industry-specific experience. One of the nice things about this area of practice is that companies are in it together. They are not fighting each other. They’re establishing industry norms and learning from each other.

Le: Our company is certainly held to the highest security standards, since we house and work with sensitive client data every day. We are both ISO 27001 and PCI DSS certified. In fact, many of the requirements and cross-checks that go into these certifications can also be useful for organizations that aren’t required to be certified. The challenge is communicating with all employees in an organization, and especially with those people to whom it matters most. When we talk about unintentional disclosures of data or insider threats from employees, that’s where the issue is. Internal policies and benchmarking standards are only as good as your people’s understanding of where the risks are and what the proper practices are.

Taal: It’s all about training. You can have processes and policies in place until the cows come home, but if they’re not communicated properly, if they’re not updated properly, if people are not trained properly to understand the policies, procedures, and processes, and if there’s no monitoring process to ensure compliance, they don’t really work. Insider threat programs should be separate from IT security programs. Most organizations still have intentional and unintentional insider threat plans lumped together under IT security. The guidance, in addition to ISO and all of these benchmarks, is to have a standalone insider threat plan and to educate the C-suite and others on that plan.

MCC: This is an area that has many executives dreaming of magic bullets. There must be some technical solution, some process, some psychological insights being developed that can really help here. Looking into the future, what do you see as the most promising approach to the insider cyberthreats companies are facing?

Taal: There’s never going to be a one-size-fits-all solution. There are a number of things, some of which we’ve touched on. Organizations have to have a separate insider threat plan, pursue it aggressively, and gain C-suite endorsement. Most organizations focus so much on the technical behavior of employees that they neglect nontechnical behaviors such as employees working overtime, never taking a vacation, or asking questions about things they’re not authorized to know. It’s always good to focus on both technical and nontechnical behaviors. Other things include: using behavioral analytics tools to help identify psychosocial events that may be warning indicators; a partnership among all of the stakeholders (IT, HR, legal, compliance, even physical security, and definitely leadership by the C-suite); an incident management plan for insider threats so you can study attacks and learn from them; and, last and most important, training from the top down and the bottom up on insider threats – what they mean, what the impact can be, and how they can be prevented. If an organization does these things, life should be easier. But the insider threat is here to stay and is not going away.
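To make the blend of technical and nontechnical monitoring Taal describes a little more concrete, here is a minimal illustrative sketch in Python. Every signal name, weight, and threshold below is hypothetical, invented only to show the shape of such a rule; it is not drawn from the interview or from any specific behavioral analytics product, and real deployments would tune these against an organization's own baselines.

```python
from dataclasses import dataclass


@dataclass
class EmployeeActivity:
    # Technical signals (hypothetical examples)
    off_hours_logins: int         # logins outside normal working hours this month
    bulk_downloads_mb: float      # data downloaded in bulk this month
    denied_access_attempts: int   # attempts to reach unauthorized areas
    # Nontechnical signals (hypothetical examples)
    days_since_last_vacation: int
    unusual_scope_questions: int  # questions about systems outside the employee's role


def insider_risk_score(a: EmployeeActivity) -> float:
    """Combine technical and nontechnical indicators into one illustrative
    risk score. Weights and cutoffs are invented for demonstration only."""
    score = 0.0
    score += 2.0 * a.denied_access_attempts
    score += 1.0 * a.off_hours_logins
    score += a.bulk_downloads_mb / 500.0           # roughly one point per 500 MB
    if a.days_since_last_vacation > 365:           # never takes a vacation
        score += 3.0
    score += 1.5 * a.unusual_scope_questions
    return score


if __name__ == "__main__":
    sample = EmployeeActivity(
        off_hours_logins=4,
        bulk_downloads_mb=1200.0,
        denied_access_attempts=2,
        days_since_last_vacation=400,
        unusual_scope_questions=1,
    )
    print(f"Illustrative risk score: {insider_risk_score(sample):.1f}")
```

The point of the sketch is simply that nontechnical indicators (the vacation and scope-of-questions signals) sit alongside the technical ones, rather than being handled by a separate, IT-only process.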

Le: You have to understand your organization before you can address the issues and decide what kind of plan or process you should put in place. That can be a challenge. What kinds of technologies are your employees using? Where are the risks? An organization that looks within itself to understand the technology it officially sanctions, as well as whatever employees may be using that isn’t sanctioned, can begin to consider how to mitigate potential risks; that’s a step forward. This is going to require collaboration across teams. Some of the technologies we offer are now being used to mitigate risks by organizations that have already been reprimanded by the DOJ. That’s what’s driving a lot of our clients toward these kinds of programs. Of course, organizations that have not yet gotten into trouble but recognize the incentive to have technologies and processes in place are going to be one step ahead. That said, many other incentives should drive organizations, and especially the C-suite, to take these issues seriously and make time to understand what other organizations are doing. Even in the absence of regulatory investigations, accidental loss or mishandling of confidential or private information can be detrimental to an organization.

Sherer: Technology is great and can help in a lot of ways. But things change so quickly that it’s difficult to imagine one approach that solves everything and could do so in a static fashion. And despite the marketing we’ve seen, I’m still not aware of any company that purports to do that. Instead, it’s about understanding what your organization is doing, and that’s a dynamic process. You can only get a snapshot, and then it changes – things move that quickly. You need to understand your environment and also pay close attention to what’s happening outside of it. That’s very helpful for benchmarking and best practices. What challenges exist, and how are other organizations dealing with them?

The other thing I try to advocate is the creation of processes where compliance is “built in” but employees or users are allowed to “opt out.” Process can make it difficult for people to be out of compliance. Look at email, for example. Process can direct employees to save email for only a certain amount of time, so that they only keep a certain volume, with exceptions for special situations such as legal holds. It curbs employee behavior. Then the same complacency that once let people save everything and never delete it instead leads employees to work within the system and manage their email accordingly. This active consideration of how things work directs behavior in positive ways and avoids letting processes or new technologies evolve organically into an unthinking status quo that leads to bad behaviors and additional challenges further down the line.
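As a concrete illustration of the “built-in compliance” email retention idea Sherer describes, the short Python sketch below checks messages against a retention window while honoring legal-hold exceptions. The message schema, retention period, and custodian list are all hypothetical, chosen only to show the shape of a default-delete, exception-aware process, not any particular product's behavior.

```python
from datetime import datetime, timedelta

# Hypothetical retention window and legal-hold list, for illustration only.
RETENTION_DAYS = 365
LEGAL_HOLD_CUSTODIANS = {"jdoe@example.com"}   # custodians under legal hold


def messages_to_purge(messages, now=None):
    """Return messages that fall outside the retention window and are not
    protected by a legal hold. Each message is a dict with 'owner',
    'received', and 'subject' keys (a made-up schema for this sketch)."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    purge = []
    for msg in messages:
        if msg["owner"] in LEGAL_HOLD_CUSTODIANS:
            continue                    # legal-hold exception: keep everything
        if msg["received"] < cutoff:
            purge.append(msg)           # outside the retention window
    return purge


if __name__ == "__main__":
    inbox = [
        {"owner": "asmith@example.com", "received": datetime(2015, 1, 10), "subject": "Old thread"},
        {"owner": "jdoe@example.com", "received": datetime(2014, 6, 1), "subject": "Held thread"},
        {"owner": "asmith@example.com", "received": datetime.utcnow(), "subject": "Recent note"},
    ]
    for m in messages_to_purge(inbox):
        print("Would purge:", m["subject"])
```

Here the compliant outcome (old mail ages out) is the default, and the deviation (retaining everything for a custodian on hold) is an explicit, documented exception rather than an ad hoc workaround.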
