
Making Privacy Concrete (Three Words Not Usually Found Together)

Most in the IT space won’t know this, but NIST has one of the world’s best concrete engineering programs. Maybe we just have concrete on the mind since a couple of us in the office are doing house renovations. But with today’s publication of NIST Internal Report 8062, An Introduction to Privacy Engineering and Risk Management in Federal Systems (NISTIR 8062), we are taking a page from the concrete folks’ book: this document, we believe, hardens the way we treat privacy, moving us one step closer to making privacy more science than art. NISTIR 8062 introduces the concept of applying systems engineering practices to privacy and provides a new model for conducting privacy risk assessments on federal systems.

There were several reasons for venturing into this territory. Certainly the Office of Management and Budget’s July 2016 update to Circular A-130 gave us a strong impetus, but our ongoing trusted identities pilot program was also a significant earlier driver. The pilots need to demonstrate their alignment with the NSTIC Guiding Principles, but in the first couple of years of the program, grant recipients often had difficulty expressing to us how their solutions aligned with the Privacy Guiding Principle. Even agreeing about the kinds of privacy risks that were of greatest concern in federated identity solutions could drag out over multiple rounds of discussion.

NIST has produced a wealth of guidance on information security risk management (the foundation of which is NIST’s Risk Management Framework), but there is no comparable body of work for privacy. While there are international privacy framework standards that include the need for identifying privacy risk, there are no widely accepted models for doing the actual assessment.

We learned from stakeholders that part of the problem is the absence of a universal vocabulary for talking about the privacy outcomes that organizations want to see in their systems. In information security, organizations understand that they are trying to avoid losses of confidentiality, integrity, and availability in their systems. The privacy field has the Fair Information Practice Principles, but as high-level principles they aren’t written in terms that system engineers can easily understand and apply. Oftentimes, privacy policy teams must make ad hoc translations to implement them in specific systems.

To try to bridge this communication gap and produce processes that are repeatable and could lead to measurable results, we began by considering how privacy and information security are related and how they are distinct. The Venn diagram below illustrates how information security operates in the space of unauthorized behavior within the system, whereas privacy is better described as dealing with the aspects of the system’s processing of personally identifiable information (PII) that are permissible, or authorized. The two fields overlap around the security of PII.

[Figure: Venn diagram of information security and privacy, overlapping at the security of PII]

We also reflected on whether having privacy engineering objectives that had some functional equivalency to confidentiality, integrity, and availability could help bridge the gap between privacy principles and their implementation in systems. Here’s what we came up with.

[Figure: the privacy engineering objectives — predictability, manageability, and disassociability]

Lastly, we developed, and confirmed with stakeholders, a privacy risk model to use in conducting privacy risk assessments. We needed a frame of reference for analysis, a clear outcome that organizations could understand and identify. In information security, the risk model is based on the likelihood that a threat could exploit a system vulnerability, and the impact if that occurs. But what is the adverse event when a system processes data about people in an authorized manner, meaning any life cycle action the system takes with data, from collection through disposal? We know that people can experience a variety of problems as a result of data processing, ranging from psychological harms like embarrassment to more quantifiable harms like identity theft. We think that if organizations could focus on identifying the likelihood that any given action the system takes with data could create a problem for individuals, and what the impact would be, they would have a clearer frame of reference for analyzing their systems and addressing any concerns they discover.
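To make the idea concrete, here is a minimal sketch of the kind of likelihood-times-impact scoring this risk model suggests. The data actions, problem descriptions, and 1–5 scales below are hypothetical illustrations, not NIST’s Privacy Risk Assessment Methodology:

```python
# Illustrative sketch only: score each data action by the likelihood that it
# creates a problem for individuals and the impact if it does, then rank.
# All values and categories here are made up for illustration.
from dataclasses import dataclass


@dataclass
class DataAction:
    name: str        # life cycle action, e.g. "collection", "disclosure"
    problem: str     # potential problem for individuals
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (severe)

    @property
    def risk_score(self) -> int:
        # Simple product, mirroring common likelihood-x-impact risk scoring.
        return self.likelihood * self.impact


actions = [
    DataAction("collection", "embarrassment from over-collection", 3, 2),
    DataAction("disclosure", "identity theft", 2, 5),
    DataAction("retention", "loss of trust from indefinite storage", 4, 3),
]

# Prioritize the highest-scoring data actions for mitigation first.
for a in sorted(actions, key=lambda a: a.risk_score, reverse=True):
    print(f"{a.name}: {a.problem} -> score {a.risk_score}")
```

The point of even a toy model like this is the frame of reference: each row names a specific system action, the specific problem it could create for individuals, and a ranking that can be communicated to management and matched to controls.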

How did this work out for our pilots? Frankly, it exceeded our expectations. Using this privacy risk model, they could identify new privacy risks, prioritize the risks, communicate them to senior management, and implement controls as appropriate (usually some combination of policy-based and technical controls). Shoutout to the pilots—we greatly appreciate your insights!

NISTIR 8062 is only an introduction to privacy engineering and risk management concepts. In the coming months and years, we will continue our engagement with stakeholders to refine these ideas and develop guidance on how to apply them. One of the properties of concrete that makes it so useful is that you can mold it into just about any shape, but once it sets you know exactly what to expect of its performance. This sort of flexible but consistent performance has long eluded those who care about systems-implementable privacy protections.

About the author

Mike Garcia

Mike Garcia is a PhD economist and Federal 100 award winning cybersecurity expert. He currently serves as lead for the Trusted Identities Group at the National Institute of Standards and Technology...

Naomi Lefkovitz

Naomi Lefkovitz is the Senior Privacy Policy Advisor in the Information Technology Lab at the National Institute of Standards and Technology, U.S. Department of Commerce. Her portfolio includes work...

Suzanne Lightman

Suzanne Lightman has over a decade of experience in information security policy in positions all over the government, as well as in the private sector.  She has held positions in both the legislative...

Ellen Nadeau

Ellen Nadeau is part of the Privacy Engineering Program at the National Institute of Standards and Technology (NIST), where she works to develop and pilot privacy risk management guidance and tools...


Comments

Way to concrete the foundation
I applaud this development relating to privacy engineering! I view it as a concrete extension of Privacy by Design, that not only complements PbD but provides solid measures that may be taken by engineers and systems designers. Great work! Ann Cavoukian
Much welcomed addition. Hopefully this effort will adhere to the KISS principle and acknowledge that not all PII present the same risks. Privacy could in fact suffer if controls were too rigid and not commensurate to the risks.
The premise to this report and other polices emerging from OMB show the emerging importance that Privacy and Security practitioners have a mutual responsibility to work together in designing, altering, or integrating systems containing PII. Now if we can get the local and state governments on-board these projects can have a positive psychological effect on every citizen.
It is so good to see NIST bring Privacy out of the closet. I promoted the hints of Privacy in NIST 800-53, but always needed to enhance with a Privacy Framework, Privacy Impact Assessment, and Privacy Risk Management. It is great to see NIST treat privacy and security as distinct, yet related. Well done.
Was the Privacy Risk Assessment Methodology (PRAM) and the PRAM forms (in the original appendix D) removed from the latest version of 8062 for a specific reason? Will there be a separate work product discussing PRAM?
We removed the PRAM from the final version of NISTIR 8062 in an effort to streamline the document and clarify that it is an introductory document rather than guidance. However, we do plan to post publicly the PRAM documents in the near future. In the meantime, if you’d like to leverage these worksheets, please email us at privacyeng@nist.gov and we’ll send them to you directly.
