
CEO vs. CISO Mindsets, Part IV: Monte Carlo Breach Cost Modeling for CISOs!

Data Security, Threat Detection

My main goal in this series is to give CISOs insight into CEO- and board-level decision making so they can make a winning case for potential data security purchases. In my initial dive last time, I explained how CISOs should quantify two key factors involved in a breach: the frequency of attacks, and the probability that the cost of a single breach exceeds a given threshold. Knowing these two ingredients (and that there are numbers, or ranges of numbers, you can assign to each) will earn you points with CEOs and CFOs.

It’s second nature for top corporate honchos to make decisions under uncertainty: they are pros at placing the right bets and knowing the odds. CISOs, in turn, should understand the language of risk and know how to do some basic risk math.

Sure, CEOs should also have basic knowledge of the amazing post-exploitation tricks hackers have at their disposal, and I’ll take that up in the next post. But I think the larger gap is getting CISOs up to speed on business knowledge.

As a bonus for CISOs and tech execs for getting this far in the series, I’ve put together a wondrous Excel spreadsheet so that you can do your own, gasp, Monte Carlo-style modeling! You’ll truly impress CEOs in your next presentation by tweaking this simulation for your particular company and industry.

Let’s Be FAIR

I’m a fan of the FAIR Institute and its framework for analyzing risk. The FAIR gang are excellent educators and guides into what is, ahem, a very wonky topic. You can go as deep as you want into the FAIR analysis, but as I described in the previous post, even a shallow dive can provide very useful results for making decisions.

At the first level of FAIR’s analysis, you need to look at the two factors I mentioned above. First, derive an exceedance loss curve for your particular industry or company. In my case, I was able to use a public healthcare dataset of breaches reported under HIPAA, and then apply results from a breach cost regression based on Ponemon’s breach survey.

I’m able to say what percentage of healthcare breaches fall above any given cost amount for a single incident.
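
If you’re curious what that looks like in practice, here’s a minimal sketch (in Python, rather than my Excel spreadsheet) of building an empirical single-loss exceedance curve from a list of per-incident costs. The `sample_costs` numbers are purely hypothetical stand-ins for a real dataset like the HIPAA breach reports.

```python
import numpy as np

def exceedance_curve(losses):
    """Return sorted losses and the empirical probability that a single loss exceeds each one."""
    losses = np.sort(np.asarray(losses, dtype=float))
    exceed_prob = 1.0 - np.arange(1, len(losses) + 1) / len(losses)
    return losses, exceed_prob

# Hypothetical per-incident breach costs standing in for a real dataset
sample_costs = [50_000, 120_000, 300_000, 750_000, 2_000_000, 9_500_000]
for cost, prob in zip(*exceedance_curve(sample_costs)):
    print(f"P(single-incident loss > ${cost:,.0f}) ≈ {prob:.2f}")
```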

By the way, insurance companies calculate a similar type of curve for auto and home policies. It’s the same problem! For them, a large claim plays the role of a costly data breach. Ultimately, insurers use loss exceedance curves to work out premiums that cover their expected payouts and leave them a profit. And we can think of the cost of a data security software license as a kind of premium companies pay to limit the loss from a breach.

Anyway, the second factor is the frequency or rate at which companies are breached. You can guesstimate an average rate, which is what I did last time for my hypothetical healthcare company. 

This brings up a more important point: what happens when you have limited real-world data? Thankfully, the FAIR approach allows for this, and there are techniques to combine or weight internal information collected by your infosec team (say, the frequency of successful SQL injections over the last five years) with any available public information from external sources such as the Verizon DBIR. This idea is partially covered in a video the FAIR people put together.
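
As a rough illustration of the weighting idea (not the formal FAIR calibration method), here’s a tiny sketch that blends an internal incident rate with an external benchmark. The rates and the weight are assumptions you’d replace with your own numbers.

```python
# Blend an internal incident rate with an external benchmark (all numbers are assumptions)
internal_rate = 0.3      # successful attacks per year, from your own infosec logs (hypothetical)
external_rate = 0.5      # industry benchmark rate from a public report (hypothetical)
internal_weight = 0.6    # how much trust you place in your internal history

blended_rate = internal_weight * internal_rate + (1 - internal_weight) * external_rate
print(f"Blended attack frequency: {blended_rate:.2f} incidents per year")
```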

What do you do with both of these factors?

You multiply them: frequency × single loss = total loss. Well, it’s not quite that simple!

Exact formulas are generally not easy to come by for real-world scenarios. And that’s why you run a Monte Carlo (MC) simulation!

In an MC simulation, you “spin the dice” (using Excel’s built-in random number generator) to simulate an attack occurring. Then you spin the dice again to generate a possible loss for that attack. You tally the losses, rank them, and produce a curve representing the total exceedance losses for a given average frequency over a period of time.
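
To make the dice spinning concrete, here’s a minimal sketch of the same idea in Python rather than the Excel/VBA macros used for this post. The distributions (a Poisson draw for how many attacks occur, a Pareto-style curve for the size of each loss) and the parameters are illustrative assumptions, not the spreadsheet’s exact setup.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_total_losses(trials=10_000, mean_incidents=4.0, x_min=100_000.0, alpha=1.5):
    """Spin the dice `trials` times and return the sorted total loss for each trial."""
    totals = np.zeros(trials)
    for i in range(trials):
        # Dice roll #1: how many breaches occur in the period (Poisson assumption)
        n_incidents = rng.poisson(mean_incidents)
        # Dice roll #2 (once per incident): a heavy-tailed, Pareto-style single loss
        losses = x_min * (1.0 + rng.pareto(alpha, size=n_incidents))
        totals[i] = losses.sum()
    return np.sort(totals)

totals = simulate_total_losses()
print(f"~10% chance the total loss exceeds ${np.percentile(totals, 90):,.0f}")
```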

In my MC simulation, I rolled the dice a few thousand times using an Excel spreadsheet with special Visual Basic macros I added. I modeled a healthcare company experiencing an average rate of four incidents over ten years, and a single loss curve based on the HIPAA dataset to produce the following total loss curve:

The total breach cost exceedance loss curve. The ultimate goal of the MC simulation!

This is really the goal of the simulation: you want a distribution or curve showing the sum of losses that occur when a random number of attacks happens over a given time period. Armed with this kind of analysis, imagine making a presentation to your CEO and CFO and confidently telling them: “There’s a 10% chance that our company will face a $35 million breach loss in the next 10 years.” From then on, your CEO will look at you with loving C-level eyes.

The key lesson from FAIR is that you can quantify data breach risk to produce a good-enough back-of-the-envelope calculation that’s useful for planning. It’s not perfect by any means, but it’s better than flying blind. Think of it as a kind of thought experiment, similar to answering a Google-style interview question. And as you go deeper into FAIR, the exercise of analyzing what data is at risk, what it’s worth, and red-teaming possible breach scenarios is valuable for its own sake! In other words, you might … learn things you didn’t know before.

Value at Risk for CISOs

My analysis of the HIPAA data involved some curve wrangling using off-the-shelf stats software. I was able to fit the dataset to a power-law-style curve (wonks can check out the Pareto distribution). Heavy-tailed curves, which are very common for breach stats (and other catastrophe data), can be approximated by power-law-like formulas in the tail.
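
For the curious, here’s a rough sketch of what that kind of tail fitting can look like. It uses the textbook maximum-likelihood estimator for the Pareto shape parameter on hypothetical numbers; it is not the off-the-shelf stats software I actually used.

```python
import numpy as np

def fit_pareto_tail(losses, threshold):
    """Maximum-likelihood estimate of the Pareto shape (alpha) for losses above a threshold."""
    tail = np.asarray([x for x in losses if x > threshold], dtype=float)
    # Textbook Pareto MLE: alpha_hat = n / sum(log(x / threshold))
    return len(tail) / np.sum(np.log(tail / threshold))

sample_costs = [50_000, 120_000, 300_000, 750_000, 2_000_000, 9_500_000]  # hypothetical
print(f"Estimated tail exponent alpha ≈ {fit_pareto_tail(sample_costs, threshold=100_000):.2f}")
```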

That’s good news!

It’s easier to work with power laws when doing simulations and crunching numbers, and the tail is really the most interesting part for planning purposes: it’s where the catastrophes are found. Sure, CFOs and CEOs look at average losses, but they’re far more focused on the worst cases.

After all, the C-levels are charged with keeping the company going even when the breach equivalent of a Hurricane Sandy hits. So they have to be prepared for these extreme events, and that means making the investments that limit the catastrophic losses found in the tail.

And that brings us to Value at Risk or VaR.

Let’s just demystify it first. It’s really a single number that tells you how bad things can get. A 90% VaR for breach losses is the number that’s greater than all but 10% of losses. A 95% VaR is greater than all but 5%.

In the curve above, you get the VaR by going to the y-axis, finding the 5% or 1% value, following the horizontal line over to the curve, and then dropping down to the x-axis to read off the value. It’s really an exercise in doing a reverse lookup. Hold that thought.

You run my MC simulation after entering an average frequency rate and a single-loss curve (or really its tail) based on a real-world dataset, and then let it generate thousands of possible scenarios. For VaR purposes, you and your C-levels are most interested in a select few scenarios: the ones that show up at the top of the ranked list.
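
In code terms, pulling the VaR out of that ranked list is just a percentile lookup over the simulated totals. A quick sketch, with a placeholder array of losses standing in for real simulation output:

```python
import numpy as np

def value_at_risk(simulated_totals, confidence=0.95):
    """Loss amount that all but (1 - confidence) of simulated outcomes stay below."""
    return np.percentile(simulated_totals, confidence * 100)

# Placeholder losses standing in for real Monte Carlo output (purely illustrative)
rng = np.random.default_rng(0)
totals = rng.lognormal(mean=15.0, sigma=1.2, size=10_000)
for level in (0.90, 0.95, 0.975, 0.99):
    print(f"{level:.1%} VaR: ${value_at_risk(totals, level):,.0f}")
```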

Below you can see specific sample runs from my Excel spreadsheet for the 90%, 95%, 97.5%, and 99% VaRs. So at the end of 10 years, the 99% VaR is over $120 million, and it turns out to involve three events (notice the jumps).

Notice the huge jumps in the 97.5% and 99% curves. It’s a feature (not a bug) of heavy-tailed curves.

The Mysteries of the Heavy-Tailed Dragon 

I lied. For heavy-tailed distributions you really don’t have to run an MC simulation to come up with some VaR numbers. There is a formula!

I’ll hint at what it might be, but to see what it is in the case of a Pareto distribution, you’ll have to download the spreadsheet. The VaR formula lets you do a quick napkin calculation. The MC simulation is still useful for verifying the formula against simulated data based on your modeling.

For background on all this, there’s a surprisingly readable presentation on this mathy subject written by two statistics guys. They describe in simple terms some of the mysterious properties of these heavy-tailed beasts. Yes, dragons are magical. One of their stranger powers is that these beasts can womp you with a single crushing event. You can see that in the 97.5% and 99% VaRs in the 10-year simulation above: notice there’s one huge jump in both these cases.

Another strange and magical thing is that a good VaR approximation can be calculated easily for many heavy-tailed datasets. I suggested it above: you can think of the VaR as a reverse lookup, which in math-speak means the inverse of a formula. In the case of multiple losses occurring at a given rate or frequency over a time period, the VaR can be calculated with a slight tweak of the inverse Pareto distribution. You’ll have to check out my Excel spreadsheet for the true formula.
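
To give a flavor of the “slight tweak” without spoiling the spreadsheet, here’s the textbook inverse (quantile) of a Pareto distribution, plus one common single-loss-style approximation for the VaR of a heavy-tailed total with a given frequency and time horizon. Treat it as illustrative math with hypothetical parameters, not necessarily the exact formula in my spreadsheet.

```python
def pareto_quantile(p, x_min, alpha):
    """Inverse Pareto CDF: the single-incident loss exceeded with probability (1 - p)."""
    return x_min * (1.0 - p) ** (-1.0 / alpha)

def approx_var(confidence, lam, years, x_min, alpha):
    """Heavy-tail single-loss approximation: solve lam*years * P(one loss > x) = 1 - confidence."""
    return x_min * ((1.0 - confidence) / (lam * years)) ** (-1.0 / alpha)

# Hypothetical parameters: 0.4 breaches/year for 10 years, Pareto tail starting at $100k
print(f"95th-percentile single loss ≈ ${pareto_quantile(0.95, x_min=100_000, alpha=1.5):,.0f}")
print(f"99% 10-year VaR ≈ ${approx_var(0.99, lam=0.4, years=10, x_min=100_000, alpha=1.5):,.0f}")
```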

What else can you do with all this probability information?

You can start working out whether an investment in data security software will pay for itself — assuming the software prevents the attack. In my spreadsheet, I let you calculate a break-even percentage based on a yearly security investment. And I’ve also worked out the average payback — how much money your software defenses will save you, on average.

Data security software pays for itself! Here’s a worked-out example for a $400,000/year investment, assuming a heavy-tailed Pareto curve based on HIPAA breach data.
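
For readers who want the gist without opening Excel, here’s a back-of-the-envelope version of the break-even logic. Every number below is hypothetical, and the fraction of losses the software actually prevents is an assumption you’d have to defend.

```python
# All numbers are hypothetical; "prevented_fraction" is an assumption, not a measurement.
annual_cost = 400_000            # yearly security investment
years = 10
average_total_loss = 6_000_000   # e.g. the mean of your simulated 10-year totals (hypothetical)
prevented_fraction = 0.8         # assumed share of breach losses the defenses would stop

total_spend = annual_cost * years
expected_savings = prevented_fraction * average_total_loss
print(f"Spend over {years} years:       ${total_spend:,.0f}")
print(f"Expected prevented losses: ${expected_savings:,.0f}")
print(f"Average payback:           ${expected_savings - total_spend:,.0f}")
```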

Let’s call it a day.

I’ll have a few more thoughts on VaR in the next post, and then we’ll get into basic knowledge that CEOs should know about post-exploitation.

I’ll end this post with a lyric from the greatest band ever, Abba of course, which I think brilliantly summarizes the devastating power of heavy-tailed breach loss distributions:

But I was a fool
Playing by the rules
The gods may throw a dice
Their minds as cold as ice
And someone way down here
Loses someone dear

Thanks, Benny, for this great insight into breach costs.

Download the breach cost modeling spreadsheet today!

Andy Green

Andy blogs about data privacy and security regulations. He also loves writing about malware threats and what they mean for IT security.

 
