Is Staff Burnout the Best Reason to Implement Cybersecurity AI?

Many in the cybersecurity workforce can’t keep up with technological change and are too busy to learn about the latest threats. Some are so burned out that they are leaving the industry entirely. These are among the findings of a June 2019 study by Goldsmiths, University of London, and Symantec, and the results should worry not only those who work in the cybersecurity space, but everyone who relies on a computer to do their work.

If there were ever a time to talk about integrating cybersecurity AI into your enterprise, it’s now.

Why the Security Burnout?

Last year, the Ponemon Institute released a report titled “Separating the Truths From the Myths in Cybersecurity.” Perhaps one of the most interesting lines from the report was that organizations are “suffering from investments in disjointed, non-integrated security products that increase cost and complexity.” This is a very telling line, because the people who ultimately have to work with the disjointed, nonintegrated security products are your security staff.

Imagine that you are on the front lines of cybersecurity now: Not only do you have to deal with a mountain of alerts and responses every day, but you also have to untangle these security products. Frustration and burnout are inevitable. It’s no wonder cybersecurity jobs are hard to fill and retaining those workers is equally difficult. And it’s no wonder Cybersecurity Ventures predicted that there will be 3.5 million unfilled cybersecurity jobs by 2021.

We are dealing with three inevitabilities:

  1. Malicious cyber activity continues to rise, whether it is from nation-states or criminal actors;
  2. Reliance on technology is increasing, not decreasing; and
  3. New technologies, such as 5G, mean more data is going to be produced and retained.

What does all that mean? It means that now, more than ever, we could surely use an assist. And oh yes, there is a fourth inevitability worth mentioning: the malicious use of artificial intelligence (AI). There’s simply no way around it: Just as you shouldn’t take a handful of tissue paper to a Nerf ball fight, you shouldn’t be fighting AI-equipped adversaries without some AI of your own. Your stressed-out employees may be your best business case for implementing cybersecurity AI.

Who, or What, Helps Share the Burden?

Now, there are those who are a bit reluctant to implement AI, and they have legitimate questions: how certain training methods are being used; what data the AI has been trained on; what data the AI will analyze; and who ultimately has control, the machine or the person.

These are all good questions, and they need to be asked before an enterprise considers integrating AI into its security posture. Why? First, doing so helps you understand your business processes and forces you into a risk-based decision-making frame of mind. Second, and perhaps more importantly in this case, asking these questions helps you avoid adding more of those disjointed, nonintegrated products mentioned above; otherwise, you just end up building more fragility into your system.

In fact, your desired state should be antifragility. Put another way, you become stronger from the attempts to break you. Enter cybersecurity AI.

Artificial Intelligence Is a Tool, Not a Crutch

With the cyberthreat landscape being what it is, what your staff faces is a simple case of piling on. If you use cybersecurity AI like a surgical tool, you begin to lighten that burden. AI can churn through mountains of data at incredibly high speed. Therefore, at least in theory, the result of cybersecurity AI doing the heavy lifting should be the following (see the sketch after this list):

  • Staff becoming more productive, since they no longer feel overwhelmed; and
  • Staff having increased ability to keep up with new threats and technologies.
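
To make that heavy lifting a bit more concrete, here is a minimal, hypothetical sketch of one way a model could pre-rank a queue of alerts so analysts see the most suspicious ones first. It is not any particular vendor’s product; the feature names, sample values and model choice (scikit-learn’s IsolationForest) are assumptions made purely for illustration.

```python
# Hypothetical sketch: rank a queue of security alerts by anomaly score
# so analysts triage the most suspicious ones first.
# Feature names and sample values are invented for illustration only.
from sklearn.ensemble import IsolationForest

# Each alert: [failed_logins, bytes_out_mb, off_hours (0/1), new_geo (0/1)]
historical_alerts = [
    [1, 5, 0, 0],
    [0, 2, 0, 0],
    [2, 8, 1, 0],
    [1, 4, 0, 0],
    [0, 3, 0, 0],
    [3, 6, 1, 0],
]

todays_alerts = [
    [1, 4, 0, 0],     # looks routine
    [40, 900, 1, 1],  # many failed logins, big transfer, off hours, new geography
    [2, 7, 1, 0],
]

# Learn what "normal" alert activity looks like from past data.
model = IsolationForest(contamination=0.1, random_state=42)
model.fit(historical_alerts)

# Lower scores are more anomalous; sort so the riskiest alert comes first.
scores = model.decision_function(todays_alerts)
for score, alert in sorted(zip(scores, todays_alerts)):
    print(f"score={score:.3f} alert={alert}")
```

The specific model matters far less than the division of labor: the machine absorbs the repetitive scoring work so the analyst’s time goes to the handful of alerts that actually deserve attention.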

But the key here is ensuring that your AI solution is in fact used as a tool and not as a crutch, something we can all be guilty of when integrating technology into our professional and personal lives. To be most effective, cybersecurity AI needs to team up with the cybersecurity workforce, not replace it.

Your Cybersecurity AI Will Only Be as Good as the Data It’s Trained On

This section header speaks for itself, but it also reinforces the need for the “human touch” as we integrate more AI into our security practices. The AI will be fantastic for fast incident response, risk identification, prioritization, automation and scalability, but it is the people who hold cybersecurity jobs today who need to make sure the AI doesn’t go off course.

To be clear, this oversight isn’t needed because the AI will have some flawed algorithm; that’s an entirely different problem. Rather, it’s a matter of giving the cybersecurity AI a guide to make sure it’s doing the right thing, because here’s the real kicker: Unless we decide to give full control to the machines (insert your favorite post-apocalyptic machine-run world movie here), we will be making decisions based on what the AI recommends.
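
One common way to keep that guide in place is a human-in-the-loop gate: the AI recommends an action, but nothing executes until an analyst signs off. The sketch below is a hypothetical pattern rather than any specific product’s workflow; the function names, alert fields and confidence values are invented for illustration.

```python
# Hypothetical human-in-the-loop pattern: the AI recommends, a person decides.
# Function names, alert fields and confidence values are invented placeholders.

def model_recommendation(alert: dict) -> tuple[str, float]:
    """Stand-in for an AI model that suggests an action with a confidence."""
    if alert.get("failed_logins", 0) > 20:
        return "block_host", 0.95
    return "monitor", 0.60

def block_host(host: str) -> None:
    print(f"[action] blocking {host}")

def handle_alert(alert: dict) -> None:
    action, confidence = model_recommendation(alert)
    print(f"AI recommends '{action}' for {alert['host']} (confidence {confidence:.2f})")

    # The machine never acts on its own: an analyst approves or overrides.
    decision = input("Approve this action? [y/N] ").strip().lower()
    if decision == "y" and action == "block_host":
        block_host(alert["host"])
    else:
        print("[action] deferred to the analyst for manual handling")

handle_alert({"host": "10.0.0.42", "failed_logins": 37})
```

The design choice that matters is the default: if the analyst does nothing, the system defers rather than acts, which keeps the AI firmly in tool territory.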

This is the danger zone, because if we’re not careful, the AI shifts from being a tool to a crutch. And the more we lean on a crutch that may have a crack in it, the harder the fall will be when it snaps.

It’s a Team Game

Just as organizational leadership and security leadership need to team up to determine the best security solution for the organization, analysts need to team up with AI to determine how to manage all the alerts and responses. When you’re feeling bogged down and you have tight deadlines, such as privacy notifications or getting a system back online, managing all the data will just feel like a mountain on your shoulders. That alone may be the best business case to get yourself a surgical AI tool.

With a lot of next-gen technology around the corner, this may be the time to do a wholesale upgrade of your operations. Done correctly, the intended results should be:

  • A better understanding of your business processes;
  • Decisions made from a risk-based approach;
  • Happier staff, ready and able to be more productive; and
  • A next-gen solution that integrates AI and washes away those disjointed, nonintegrated products.

Your staff will no doubt appreciate the upgrade in offensive and defensive capacity. In fact, they may appreciate it so much they’ll see no reason to look for other cybersecurity jobs. With 3.5 million unfilled jobs on the horizon, holding on to these immensely qualified people may be critical for your enterprise.