Everyone agrees that core protection tools are necessary from a risk management and compliance standpoint, but what about adding new programs to reduce your cyber risk? Quantifying your cyber risk is difficult but necessary to establish a prudent financial evaluation and planning process that delivers the right ROI.
There are so many unknowns, starting with the probability of a breach and ending with estimating its impact. It is helpful to break these elements down and then work to quantify ranges for each of them with specifics for your firm.
Let’s start with estimating the likelihood, or probability, that your organization will experience a breach or a successful attack. A successful breach requires an existing vulnerability in your firm that a threat (or bad actor) can find and exploit. You also need to estimate the value of the underlying asset you are protecting: what is the cost of that asset being compromised? When a valuable asset (e.g., an application or database with sensitive data, or a client with access to that data) has a vulnerability that can be exploited, the consequences can be significant.
Taking these points together, I think the best model for quantifying cybersecurity risk is the following:
$ Value of Cyber Risk = Σ ( p(Threat) × p(Vulnerability) × $ Value of Assets at Risk )
- p(Threat): the probability the threat succeeds, based on the attacker's level of sophistication and resources
- p(Vulnerability): the probability a vulnerability is present and exploitable to produce a material impact
- Consequence: the $ value of the asset(s) at risk
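The formula above can be sketched in a few lines of code. This is an illustrative model only: the asset names, probabilities, and dollar values below are hypothetical placeholders, not benchmarks for any real firm.

```python
def cyber_risk(assets):
    """Sum p(threat) * p(vulnerability) * asset value across all assets at risk."""
    return sum(a["p_threat"] * a["p_vuln"] * a["value"] for a in assets)

# Hypothetical inventory: annual probability a capable threat targets the asset,
# probability an exploitable vulnerability is present, and dollar value at risk.
assets = [
    {"name": "customer database", "p_threat": 0.6, "p_vuln": 0.3, "value": 5_000_000},
    {"name": "billing application", "p_threat": 0.4, "p_vuln": 0.2, "value": 2_000_000},
    {"name": "employee laptops", "p_threat": 0.8, "p_vuln": 0.5, "value": 500_000},
]

print(f"Estimated annual cyber risk: ${cyber_risk(assets):,.0f}")
```

Even with rough ranges instead of point estimates, running this across your asset inventory makes the drivers of your risk explicit and comparable.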
Expanding on the three components of the cybersecurity risk formula:
1. Threats: The ability to deliver a successful attack or exploit against your organization.
The threat or hacker economy is growing every day, bringing more sophisticated attackers with greater resources. A few of the macro trends driving this phenomenon are: global economic disparity across countries, state-sponsored attacks, and anonymous payment capabilities like Bitcoin.
“The hackers of today are far more skilled, organized and well funded than ever before. As such, they are getting better at finding weaknesses, penetrating security barriers and enacting more damaging attacks once inside a company.”
OWASP defines the following parameters to estimate a specific threat:
- Skill level: How technically skilled is this group of threat agents?
- Motive: How motivated is this group of threat agents to find and exploit this vulnerability?
- Opportunity: What resources and opportunities are required for this group of threat agents to find and exploit this vulnerability?
- Size/Resources: How large is this group of threat agents and what resources do they have access to? (particularly relevant for growing state sponsored attacks)
The problem with this is that you rarely know who is targeting your firm, and you have to assume you will be dealing with numerous overlapping threats. Again, the greater the asset's value to the attacker, the more they are willing to invest.
The exploit marketplace is exploding. Sites like Zerodium (focused on white hat hackers) post bounties for zero-click exploits ranging from $500K to $1.5M. The dark web marketplace for malicious hackers offers much higher ranges.
Given the rising capabilities and scale of the threat economy, you should again assume that they will compromise your organization through one method or another.
2. Vulnerabilities: Weaknesses in your environment.
Software vulnerabilities are growing at an alarming rate as we produce more software and reuse components or services that contain unknown vulnerabilities. The top 10 ISVs today have over 10,000 active critical (level 8 to 10) vulnerabilities, and the count continues to grow each year, with another 1,600 level 8 to 10 vulnerabilities discovered last year.
OWASP defines the following parameters to estimate your Vulnerability exposure:
- Ease of discovery: How easy is it for this group of threat agents to discover this vulnerability?
- Ease of exploit: How easy is it for this group of threat agents to actually exploit this vulnerability?
- Awareness: How well known is this vulnerability to this group of threat agents?
- Intrusion detection: How likely is an exploit to be detected?
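The OWASP Risk Rating Methodology scores each of the threat-agent and vulnerability factors above from 0 to 9 and averages them into an overall likelihood, bucketed as LOW, MEDIUM, or HIGH. The sketch below follows that scheme; the individual scores are hypothetical examples, not a real assessment.

```python
def likelihood(factors):
    """Average 0-9 factor scores and bucket per OWASP (<3 LOW, <6 MEDIUM, else HIGH)."""
    avg = sum(factors.values()) / len(factors)
    if avg < 3:
        return avg, "LOW"
    if avg < 6:
        return avg, "MEDIUM"
    return avg, "HIGH"

# Hypothetical scores for one threat/vulnerability pairing.
factors = {
    # threat-agent factors
    "skill_level": 6, "motive": 8, "opportunity": 5, "size": 7,
    # vulnerability factors
    "ease_of_discovery": 4, "ease_of_exploit": 5,
    "awareness": 6, "intrusion_detection": 8,
}

avg, rating = likelihood(factors)
print(f"Likelihood score {avg:.2f} -> {rating}")
```

Scoring each factor explicitly, even roughly, forces a conversation about which assumptions drive the rating.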
One of the most famous recent vulnerabilities is the Apache Struts flaw that was ground zero for the Equifax hack. Equifax took 5 months to discover and patch that vulnerability. The hackers compromised the initial server 3 months after disclosure, which is the average patch cycle for organizations.
According to Gartner, the average time it takes to discover and exploit a known vulnerability is now 15 days (down from 45).
Every organization is vulnerable once you consider all your applications and hosts across the enterprise and the long cycle to detect, resolve and patch before the vulnerability is closed. It is absolutely necessary to sweep your environment for software vulnerabilities that have slipped through the patch cycles.
3. Consequences = Value of your digital assets or firm being compromised.
If your firm is embracing technology in your value chain, the value of these assets can approach the value of the entire firm. The cost of downtime or an information breach rises every day as you expand your digital presence with your customers, partners, and employees. The primary costs of a breach are financial damage, reputation damage, non-compliance penalties, and privacy violations.
Here are some industry standard numbers to help you estimate your risks based on the average cost of a breach.
- At least 1 in 3 businesses experiences a significant security incident each year.
- For the mid-market, the average cost of a breach is $400 to $700 per endpoint, or $141 per customer record.
- For larger enterprises, Ponemon estimates the cost at $4M, up 10% from last year.
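These benchmarks translate into a quick back-of-envelope estimate once you plug in your own numbers. The endpoint and record counts below are hypothetical placeholders for a mid-market firm.

```python
# Mid-market benchmarks cited above: $400-$700 per endpoint, $141 per customer record.
endpoints = 1_000          # hypothetical endpoint count
customer_records = 50_000  # hypothetical record count

per_endpoint_low = 400 * endpoints
per_endpoint_high = 700 * endpoints
per_record = 141 * customer_records

print(f"Endpoint-based estimate: ${per_endpoint_low:,} - ${per_endpoint_high:,}")
print(f"Record-based estimate:   ${per_record:,}")
```

Note how quickly the per-record figure dominates for firms holding large volumes of customer data.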
But even these averages assume you have a solid cybersecurity foundation covering prevention and compliance basics. The damage for the unprepared and underinvested can be massive: the City of Atlanta's ransomware recovery costs ballooned from $3M to $17M, with some core business processes, like customer billing, still unable to fully recover.
The largest driver of increased breach costs is the Mean Time to Detect and Contain (MTTD and MTTC), or Dwell Time, which continues to average almost 6 months, giving attackers time to significantly amplify the damage they inflict.
Aberdeen analyzed data from over 2,000 breaches and found that limiting Dwell Time to 7 days reduces this risk by 77%. Ponemon's 2018 analysis of 500 breaches likewise concluded that a firm's Incident Response capability is the largest driver of risk reduction.
Netting all this out…
Given the above, we start with the assumption that every defense and every organization can be, and likely has been, compromised. No set of defenses or protection solutions can block 100% of threats. This is why you must have a Threat Detection capability that can sweep behind your protection tools and identify threats, assets, applications… inside your environment that shouldn't be there.
Providing this capability is the basis of Infocyte. We believe every organization must have the ability to run an effective process to proactively Detect and Respond to incidents for a reasonable cost. The DETECTION and RESPONSE solution should also be independent from the PROTECTION solutions.