My recent online reading and posting have made it quite clear that quantifying cyber risk is a big deal.  Earlier this month, I shared my colleague Ron Brash’s latest podcast, in which he talks about the lack of data available to make informed decisions, and the response was significant!  It seems that many are struggling to separate fact from fiction, which, to me, is quite understandable, but it also introduces a host of related angles to examine when trying to understand what state we are truly in when it comes to ICS cyber security.

Specifically, I don’t think most CIOs/CISOs have ever really calculated what it costs to secure OT environments.  I have also seen a recent rise in cyber security insurance, which requires enough ‘data’ about past behavior as a fundamental prerequisite for building actuarial tables.  But the really interesting thing here is the debate about what you actually need to secure in an OT environment.

Securing OT environments is not a new topic.  I recall once interviewing for a new job, and one of the company’s most ardent security evangelists looked at me and said, “You better not think OT security is about Windows and IT-type devices”.  (I didn’t take that job, in case you were wondering.)  But that focus on embedded or ‘control’ equipment does have its supporters too, like this industry stalwart and his view on ‘Ground Up Security’, which posits that controllers are the most vital link in the operational chain.  The opposite view was summed up quite well (among other salient security points) by FireEye in their recent blog about how they approach cyber security.  The article covers a host of topics and ideas, but I wanted to draw your attention to a very specific one, namely the Theory of 99.

Then, this week, multiple people referred me to that same FireEye article, which introduces the notion of the Theory of 99.  Specifically, the authors state (from empirical evidence – hint, hint – foreshadowing!) the following:

In intrusions that go deep enough to impact OT:

  • 99% of compromised systems will be computer workstations and servers
  • 99% of malware will be designed for computer workstations and servers
  • 99% of forensics will be performed on computer workstations and servers
  • 99% of detection opportunities will be for activity connected to computer workstations and servers
  • 99% of intrusion dwell time happens in commercial off-the-shelf (COTS) computer equipment before any Purdue level 0-1 devices are impacted

So the answer to bottom up vs. top down (or protect the Windows devices vs. protect the embedded devices) is to protect both.  You might think that answer is too easy and avoids the question, but regardless of your opinion:

Embedded devices or OS-based ones – the question still remains: how many assets and how much risk are we dealing with, and how much protection and remediation is required?

The simple answer is that we need to decide based on facts, not conjecture, opinion or hyperbole.  How do we do that?  Imagine that we could have an empirical discussion based on real-time endpoint parameters (yes, a robust, multi-context inventory).  By this, I don’t mean a ‘simple inventory’ consisting of IP addresses, MAC addresses and a few inferred or intercepted characteristics.  I mean a robust, 360-degree view of the asset.  That means combining asset details (OS, open ports, installed software, users, logins, error logs, etc.) with ‘tribal’ knowledge or metadata (such as system criticality, owner, location, redundancy, etc.) and third-party references (NVD, backup, whitelisting or AV status).
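To make that more concrete, here is a minimal sketch in Python of what one multi-context asset record might look like once discovered endpoint details, tribal knowledge and third-party references are brought together.  The field names and values are purely illustrative assumptions, not any particular product’s schema:

```python
# A minimal sketch of a multi-context asset record.
# All field names and sample values are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class AssetRecord:
    # Discovered endpoint details
    hostname: str
    os: str
    open_ports: List[int] = field(default_factory=list)
    installed_software: List[str] = field(default_factory=list)
    local_users: List[str] = field(default_factory=list)

    # 'Tribal' knowledge / metadata supplied by the operator
    criticality: str = "unknown"        # e.g. "mission-critical", "important", "non-critical"
    owner: Optional[str] = None
    location: Optional[str] = None
    redundant: bool = False

    # Third-party reference data
    known_cves: List[str] = field(default_factory=list)   # e.g. matched against the NVD
    backup_current: bool = False
    av_or_whitelisting: bool = False


# Example: one engineering workstation described from all three angles.
ews = AssetRecord(
    hostname="EWS-01",
    os="Windows 10",
    open_ports=[135, 445, 3389],
    installed_software=["Vendor Engineering Suite"],
    local_users=["operator", "vendor_support"],
    criticality="mission-critical",
    owner="Plant 2 maintenance",
    location="Control room",
    redundant=False,
    known_cves=["CVE-0000-00000"],   # placeholder identifier, not a real CVE
    backup_current=True,
    av_or_whitelisting=True,
)
```

The point of the structure is simply that no single column tells the whole story; it is the combination of discovered facts, operator knowledge and external references that makes the record decision-ready.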

These multiple aspects of an asset allow an operator to make an informed decision on risk.  Understanding how many of your assets carry a critical risk (yes, embedded assets as well as OS-based or ‘IT-type’ assets alike), but then filtering out those that are not mission critical, sit in heavily segmented subnets, or have compensating controls applied (like least privilege), allows you to prioritize and even (dare I suggest it) do nothing and accept contextual, low-impact risk.
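As a rough illustration of that prioritization logic, the sketch below (again Python, with simplified, assumed fields and deliberately naive rules) filters a list of assets down to the critical risks that are not already mitigated by context:

```python
# A minimal sketch of context-based risk prioritization.
# The fields and the filtering rules are simplified assumptions for illustration only.
from typing import Dict, List

assets: List[Dict] = [
    {"name": "EWS-01",  "risk": "critical", "mission_critical": True,  "segmented": False, "compensating_controls": False},
    {"name": "HIST-02", "risk": "critical", "mission_critical": False, "segmented": True,  "compensating_controls": True},
    {"name": "PLC-07",  "risk": "high",     "mission_critical": True,  "segmented": True,  "compensating_controls": False},
]


def prioritize(assets: List[Dict]) -> List[Dict]:
    """Keep only critical risks that context does not already reduce."""
    actionable = []
    for a in assets:
        if a["risk"] != "critical":
            continue   # start with the worst findings
        if not a["mission_critical"]:
            continue   # contextual, low-impact risk: accept it
        if a["segmented"] and a["compensating_controls"]:
            continue   # risk already reduced by existing controls
        actionable.append(a)
    return actionable


print([a["name"] for a in prioritize(assets)])   # -> ['EWS-01']
```

In this toy example, only one of three ‘critical’ findings actually demands remediation; the rest can be consciously accepted or deferred, which is exactly the kind of informed decision a multi-context inventory enables.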

This type of insight would render the top-down vs. bottom-up debate moot.  It would provide empirical evidence for actuarial tables.  It would graphically and scientifically outline true operational risk.  If you want to quantify risk, you must first start with a quantifiable inventory.  Only then can you have informed discussions instead of heated debates.
