One Control Catalog to Rule Them All

In my daily work as CISO, my team and I spend considerable time, effort, and expense readying ourselves for audits. Normally this means moving huge amounts of data around to validate that the security plans we have developed are functioning as designed and are protecting our systems.

It can be complicated: depending on the complexity of the organization, there are usually many, many systems, each with slightly different requirements, timelines, features, staff, and so on. The effort of an audit is multiplied by that complexity.

From an outside view, it’s interesting to see the external forces on this dynamic. The government developed the Cybersecurity Maturity Model Certification (CMMC) to ensure that the government itself and its subcontractors were all compliant with a minimum set of these audit controls. Within the Defense Industrial Base (DIB), this has caused a great deal of consternation, because these mandates can be difficult and expensive to comply with. At the same time, it is this type of regulation that can be a major driver toward a more secure foundation for the industry as a whole. CMMC has historically also been a challenging problem because of its all-or-nothing approach, meaning that failing an audit can represent sudden jeopardy for contractors. While the government and industry work together to sort out how this will unfold, the force of its impact on systems is undeniable.

At the same time, we see NIST at work on new standards and capabilities like OSCAL, the RMF, and continuous monitoring that offer the architectural and technical means to bring about the next generation of performance and process improvement: standardizing how we achieve compliance and, more importantly, a secure baseline in which to operate.

So why all of this preamble? We all already know this, right?

It occurs to me that sometimes the picture isn’t as clear as it can be, so I wanted to offer a little perspective from the inside of a compliance program.

There are a few things we need to get into here. I want to list the standards up front so we’re on the same page. Here is what I’m interested in:

  • NIST SP800-53r5 Security and Privacy Controls Catalog
  • NIST SP800-53A Assessment Guide
  • NIST SP800-53B Controls Baseline Tailoring Guide
  • NIST SP800-171r2/r3 Protecting Controlled Unclassified Information
  • NIST SP800-172 Enhanced Security Requirements for Protecting Controlled Unclassified Information
  • CMMC v2 Cybersecurity Maturity Model Certification
  • NIST Cybersecurity Framework (CSF)
  • NIST Risk Management Framework (RMF)
  • DISA Security Technical Implementation Guides (STIG)
  • DoD Overlays (e.g. Safeguarding of NNPI)

Now if you’re in the field, one of those things doesn’t look like the others. Rest assured, I’m only discussing public information here, but the concepts are relevant on a broad basis.

When you look at this list you could easily add any number of additional requirements like PCI-DSS, HIPAA, FedRAMP, SOX, ISO 27001, GDPR, and CCPA (need I go on?). But the point should be clear: there are a number of overlapping standards.

The standard approach for managing this is to do a ‘crosswalk’, which maps these standards to each other. If you want to be compliant with two different standards, some of the controls overlap, and you can simplify by stating that the control you’re implementing satisfies standard A and standard B simultaneously.
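As a toy sketch of the idea (the organizational control IDs and the standard mappings below are invented for illustration, not authoritative crosswalk data), a crosswalk is essentially a many-to-many mapping that lets one implemented control claim coverage in several standards at once:

```python
# A minimal crosswalk sketch. The organizational control IDs and the
# standard mappings below are invented placeholders, not real crosswalk data.
from collections import defaultdict

# Each implemented control lists the (standard, requirement) pairs it satisfies.
CROSSWALK = {
    "ORG-PWD-01": [("SP800-53", "IA-5"), ("ISO27001", "A.9.4.3")],
    "ORG-LOG-01": [("SP800-53", "AU-2"), ("PCI-DSS", "10.2")],
}

def satisfied_requirements(implemented):
    """Report which requirements of which standards are covered."""
    coverage = defaultdict(set)
    for control in implemented:
        for standard, requirement in CROSSWALK.get(control, []):
            coverage[standard].add(requirement)
    return dict(coverage)

print(satisfied_requirements(["ORG-PWD-01", "ORG-LOG-01"]))
# e.g. {'SP800-53': {'IA-5', 'AU-2'}, 'ISO27001': {'A.9.4.3'}, 'PCI-DSS': {'10.2'}}
```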

NIST started approaching this problem some time ago by developing what it refers to as baseline tailoring. In the case of NIST SP800-53, the 53 ‘catalog’ is a giant book of possible things you could do, while the 53B ‘baseline tailoring guide’ helps you determine which ones you actually need for a specific purpose.
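In code terms, the relationship is subset selection: the catalog is the universe of controls, and a baseline names the subset you actually need. A minimal sketch, with made-up control IDs and baseline memberships (these are not the actual 53B baselines):

```python
# Baseline tailoring as subset selection. The control IDs and baseline
# memberships here are illustrative, not the actual 53B baselines.
CATALOG = {"AC-1", "AC-2", "AU-2", "IA-5", "SC-7", "SI-4"}  # the '53' catalog

BASELINES = {
    "low":      {"AC-1", "IA-5"},
    "moderate": {"AC-1", "AC-2", "AU-2", "IA-5"},
    "high":     set(CATALOG),  # everything, purely for illustration
}

def tailor(baseline_name):
    """Select only the controls a given baseline actually requires."""
    selected = BASELINES[baseline_name]
    assert selected <= CATALOG, "a baseline can only select from the catalog"
    return sorted(selected)

print(tailor("moderate"))  # ['AC-1', 'AC-2', 'AU-2', 'IA-5']
```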

Today we see nods to this approach in other standards as they crosswalk to 53 from their own requirements. For example, if you want to implement 171, it will tell you directly which 53 controls satisfy each 171 requirement. The same is true for ISO 27001, HIPAA, FedRAMP, most STIGs, and others. The 53 catalog is central to most standards in this regard.

We anticipate that NIST SP800-171r3 will endorse this strategy even more centrally by dropping the direct statement of controls and their familiar crosswalk, and instead acting as a baseline tailoring overlay on 53 directly, the same way 53B specifies the ‘low’, ‘moderate’, and ‘high’ baselines today.

And this brings us back to CMMC. As version 2 winds its way toward production, it increasingly looks like CMMC v2 Level 2 maps very closely to SP800-171, and CMMC v2 Level 3 may map to NIST SP800-172. Where the standard adds value is in its specificity about how controls are implemented, through practices and guidance.

In the same way, 53A provides assessment guidance for 53, so that assessors know what evidence to look for when determining whether a 53 control has been correctly implemented.

This is further complicated by the fact that the controls themselves are not always as specific as they need to be in order to apply them. For example, NIST SP800-53r5 IA-5 Authenticator Management specifies many of the properties of passwords. But the way those properties are applied to different software inherently changes how the control is implemented and audited. To borrow a software concept: if IA-5 is a base class of control, implementing it on a Windows 11 operating system is a subclass (or potentially an instance) of the control, where the specifics are not supplied by the base class. That subclass differs from how the control would be implemented on a Windows server, and differs again on a Windows domain, either because the core technology is different or because the scope of the control’s use has changed.
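Carrying that software metaphor through literally, here is a minimal sketch; the parameter value and the stubbed lookups are invented for illustration, not quoted from IA-5 or any STIG:

```python
# IA-5 as a base class; platform-specific implementations as subclasses.
# The parameter value and stubbed lookups are invented for the sketch.

class IA5AuthenticatorManagement:
    """Base class: the abstract control with an organization-defined parameter."""
    min_password_length = 14  # example organization-defined value

    def assess(self) -> bool:
        raise NotImplementedError("the base control cannot check any platform")

class IA5Windows11Desktop(IA5AuthenticatorManagement):
    """Subclass: how the control is realized and audited on one desktop."""
    def assess(self) -> bool:
        # In practice this would read the endpoint's local security policy.
        return self._read_local_policy() >= self.min_password_length

    def _read_local_policy(self) -> int:
        return 14  # stubbed for the sketch

class IA5WindowsDomain(IA5AuthenticatorManagement):
    """Same control, different scope: assessed once at the domain policy level."""
    def assess(self) -> bool:
        # In practice this would query the domain's group policy.
        return self._read_domain_policy() >= self.min_password_length

    def _read_domain_policy(self) -> int:
        return 14  # stubbed for the sketch

print(IA5Windows11Desktop().assess(), IA5WindowsDomain().assess())  # True True
```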

In the audit story I mentioned in the opening, this results in the control being meticulously documented in a system security plan (SSP) for the authorizing official (AO) to approve before an authority to operate (ATO) can be granted.

To keep that in context, the scope of a system security plan can include multiple computers, each with its own software stack. Choosing a minimal standard, like the unclassified commercial requirement of NIST SP800-171, there are 110 controls that could potentially be applied. However, any of those controls could be applied multiple times, depending on the complexity of the computer. So if your ‘system’ had 20 computers in it, and each averaged 60 applicable controls, you would be looking at 1,200 control statements in your SSP. This is simplified when things like operating system controls are applied the same way for each instance of the operating system, but an audit of the system could still collect any or all such instances.
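The back-of-the-envelope math, and the saving you get when machines collapse into shared component classes, looks like this (the class count of four is an assumption for illustration):

```python
# Back-of-the-envelope SSP sizing, using the example numbers above.
computers = 20
avg_controls_per_computer = 60  # of the 110 SP800-171 requirements, on average

per_machine_statements = computers * avg_controls_per_computer
print(per_machine_statements)  # 1200 statements to write, maintain, and audit

# If those 20 machines collapse into, say, 4 component classes that share
# identical implementations (an assumed count, for illustration), the
# documentation burden shrinks sharply, even though an audit may still
# sample every individual instance.
component_classes = 4
per_class_statements = component_classes * avg_controls_per_computer
print(per_class_statements)  # 240
```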

As you might imagine, the overhead here is significant. As mentioned earlier, NIST OSCAL (Open Security Controls Assessment Language) can help here. In an earlier generation of this work, SCAP (Security Content Automation Protocol) developed CCEs (Common Configuration Enumeration) to help specify how settings are applied to specific targets. For example, if you want to control password length, you can look in a particular registry setting on a Windows 11 operating system to determine the current value. These identifiers were later referenced in SCAP’s OVAL (Open Vulnerability and Assessment Language) and XCCDF (Extensible Configuration Checklist Description Format). Which brings us back to OSCAL. While the initial development in OSCAL has been to specify the controls that need to be applied to a specific targeted computer, the underlying control catalog it references must develop an automated way of assessing that catalog.
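To make the shape of such a check concrete, here is a hedged sketch; the check identifier, registry location, and expected value are placeholders, not a real CCE, OVAL, or XCCDF entry:

```python
# A CCE-style check definition: a control property bound to a concrete place
# to look on a specific target. Every identifier, path, and value below is a
# placeholder, not a real CCE, OVAL, or XCCDF entry.
CHECKS = [
    {
        "check_id": "EXAMPLE-CCE-0001",        # placeholder identifier
        "control": "IA-5",
        "target": "Windows 11 Desktop",
        "setting": "MinimumPasswordLength",
        "locator": r"HKLM\EXAMPLE\PolicyKey",  # placeholder location
        "expected_minimum": 14,
    },
]

def evaluate(check, observed_value):
    """Compare a value collected from the target against the expectation."""
    return observed_value >= check["expected_minimum"]

print(evaluate(CHECKS[0], observed_value=14))  # True
```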

Almost a year ago I wrote an article about this called ‘The future of SCAP is the missing gap in OSCAL’. In that article I reference the OSCAL Component Model. The layered approach of OSCAL makes it easy to specify the controls that need to be applied to a system, while the component model describes how those controls will be automatically assessed. It is this automatic assessment that makes OSCAL a game changer.

But underlying that automation framework are the OSCAL catalog and profile models, and it is this layer that primarily interests me in this post. For OSCAL to be effective, it must specify not just the concept of the SP800-53 control (for example) that must be applied, but also the control variables that define how the control will be applied, so that it can then be evaluated.
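As a sketch of what ‘control plus variables’ might look like (the field names are mine for illustration, not the actual OSCAL schema), a control is not assessable until a profile supplies values for its organization-defined parameters:

```python
# A control with organization-defined parameters, plus a profile that sets
# them. The field names are illustrative, not the actual OSCAL schema.
control = {
    "id": "ia-5",
    "title": "Authenticator Management",
    "params": {"min_password_length": None, "max_password_age_days": None},
}

profile_values = {"min_password_length": 14, "max_password_age_days": 60}

def resolve(control, values):
    """Produce an assessable control: every parameter must receive a value."""
    resolved = dict(control, params={**control["params"], **values})
    unset = [name for name, value in resolved["params"].items() if value is None]
    if unset:
        raise ValueError(f"cannot assess {control['id']}: unset params {unset}")
    return resolved

print(resolve(control, profile_values)["params"])
# {'min_password_length': 14, 'max_password_age_days': 60}
```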

Before doing this within the constraints of an automated model, it is worth doing the architectural, or paper, exercise of normalizing the control catalog and working to confine systems to as few ‘variants’ of that catalog as possible.

In this case, I’m recommending a simple three-dimensional table. The first axis is simple: all of the base-class controls. I like SP800-53r5 here, but there are others as well. The second axis is the crosswalk component that does the baseline tailoring. Examples might be Unclassified Commercial (SP800-171), Unclassified Controlled Information Commercial (SP800-172), Classified Secret (SP800-53 Moderate), and Classified Top Secret (SP800-53 High). (These are approximations for the sake of example; your requirements will vary.) This second axis simply states which SP800-53 controls are required by the baseline. The third axis is the ‘class’ of component that implements the control. Per the prior example, this could be something like a ‘Managed Windows 11 Desktop’. For each environment, this yields a requirement that the component class implement specific controls based on the environment it is in.

The last addition to this model is the sub-variant of the control, which captures any component-specific variations in how the control is implemented. In this way you might have an IA-5 control variant for a Windows 11 desktop that differs from how the control is implemented on a Windows 2022 server. Each control variant would be a subclass of the original control.

This 3D model gives you a complete control catalog, a list of environments (regulatory overlays), and a set of component classes that can exist. With such a catalog, your system security plan becomes a much smaller inventory of components mapped to their device classes, with the listing of controls derived from that mapping.
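A minimal sketch of that three-axis lookup and the derived SSP follows; every control ID, environment name, and component class below is illustrative:

```python
# The three axes: base controls, environment overlays, component classes.
# All IDs, environments, and class names below are illustrative.
CONTROLS = {"IA-5", "AU-2", "SC-7"}              # axis 1: base control catalog

OVERLAYS = {                                     # axis 2: baseline per environment
    "Unclassified Commercial (171)":   {"IA-5", "AU-2"},
    "Classified Secret (53 Moderate)": {"IA-5", "AU-2", "SC-7"},
}

COMPONENT_CLASSES = {                            # axis 3: what each class implements
    "Managed Windows 11 Desktop":  {"IA-5", "AU-2"},
    "Managed Windows 2022 Server": {"IA-5", "AU-2", "SC-7"},
}

def required_controls(environment, component_class):
    """Intersect the environment's overlay with the class's implementations."""
    return sorted(OVERLAYS[environment] & COMPONENT_CLASSES[component_class])

# The SSP then shrinks to an inventory: component -> class, controls derived.
inventory = {
    "dev-laptop-01": "Managed Windows 11 Desktop",
    "file-srv-01":   "Managed Windows 2022 Server",
}
for device, cls in inventory.items():
    print(device, required_controls("Classified Secret (53 Moderate)", cls))
```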

So this is a lot of overhead. Really, it’s not lost on me that this control catalog is a big undertaking. However, there are a few different ways to look at the benefits:

  • This reduces the OSCAL Control Model to a single base class of NIST SP800-53 and crosswalks the control baseline tailoring to the OSCAL Profile Model. (Already done – see GitHub)
  • This reduces the OSCAL Component Model to Control Implementations with corresponding assessment criteria (e.g. the CCE model)
  • This reduces the OSCAL SSP model to a system component inventory mapped to the OSCAL Component Model catalog.
  • This allows for the OSCAL POA&M Assessment methodology to automate the collection and assessment of evidence.

So while we get ready for our next audit, I’m thinking: how many SSPs do I have today, and how customized are they? What are the architectural patterns and engineering component models I can begin to implement so that these systems become self-documenting, and the SSP becomes a lens into a continual compliance framework rather than a large manual lift?

Hopefully we can get the ship steered in the right direction and align our systems thinking before the OSCAL tools arrive and demand an architectural model that we are not yet ready for, but can already see.
