
Making Privacy Concrete (Three Words Not Usually Found Together)

Most in the IT space won’t know this, but NIST has one of the world’s best concrete engineering programs. Maybe we just have concrete on the mind since a couple of us in the office are doing house renovations, but with today’s publication of NIST Internal Report 8062, An Introduction to Privacy Engineering and Risk Management in Federal Systems (NISTIR 8062), we are taking a page from the concrete folks’ book: a document that we believe hardens the way we treat privacy, moving us one step closer to making privacy more science than art. NISTIR 8062 introduces the concept of applying systems engineering practices to privacy and provides a new model for conducting privacy risk assessments on federal systems.

There were several reasons for venturing into this territory. Certainly the Office of Management and Budget’s July 2016 update to Circular A-130 gave us a strong impetus, but our ongoing trusted identities pilot program was also a significant earlier driver. The pilots need to demonstrate their alignment with the NSTIC Guiding Principles, but in the first couple of years of the program, grant recipients often had difficulty expressing to us how their solutions aligned with the Privacy Guiding Principle. Even agreeing about the kinds of privacy risks that were of greatest concern in federated identity solutions could drag out over multiple rounds of discussion.

NIST has produced a wealth of guidance on information security risk management (the foundation of which is NIST’s Risk Management Framework), but there is no comparable body of work for privacy. While there are international privacy framework standards that include the need for identifying privacy risk, there are no widely accepted models for doing the actual assessment.

We learned from stakeholders that part of the problem is the absence of a universal vocabulary for talking about the privacy outcomes that organizations want to see in their systems. In information security, organizations understand that they are trying to avoid losses of confidentiality, integrity, and availability in their systems. The privacy field has the Fair Information Practice Principles, but as high-level principles they aren’t written in terms that system engineers can easily understand and apply. Oftentimes, privacy policy teams must make ad hoc translations to implement them in specific systems.

To try to bridge this communication gap and produce repeatable processes that could lead to measurable results, we began by considering how privacy and information security are related and how they are distinct. The Venn diagram below illustrates how information security operates in the space of unauthorized behavior within the system, whereas privacy deals with system processing of personally identifiable information (PII) that is permissible, or authorized. The two fields overlap around the security of PII.

[Figure: Venn diagram showing the relationship between security and privacy, overlapping at the security of PII]
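One way to see the distinction in action: the sketch below encodes the diagram’s logic as a small classification function. The function, its inputs, and the labels are our own illustrative framing, not terminology from NISTIR 8062.

```python
def concern_areas(involves_pii: bool, behavior_authorized: bool) -> set:
    """Map a system event onto the security/privacy Venn diagram (illustrative)."""
    areas = set()
    if not behavior_authorized:
        # Information security deals with unauthorized behavior in the system.
        areas.add("information security")
        if involves_pii:
            # The overlap: unauthorized behavior involving PII, e.g., a breach.
            areas.add("security of PII (overlap)")
    elif involves_pii:
        # Privacy deals with authorized, permissible processing of PII.
        areas.add("privacy")
    return areas

# A breach of stored PII lands in the overlap:
print(concern_areas(involves_pii=True, behavior_authorized=False))
# Routine, authorized analytics over PII is squarely a privacy question:
print(concern_areas(involves_pii=True, behavior_authorized=True))
```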

We also reflected on whether having privacy engineering objectives that had some functional equivalency to confidentiality, integrity, and availability could help bridge the gap between privacy principles and their implementation in systems. Here’s what we came up with.

[Figure: The privacy engineering objectives]
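For reference, the three objectives NISTIR 8062 proposes are predictability (enabling reliable assumptions by individuals, owners, and operators about PII and its processing), manageability (providing the capability for granular administration of PII, including alteration, deletion, and selective disclosure), and disassociability (enabling processing of PII or events without association to individuals or devices beyond the system’s operational requirements). As a minimal sketch of how a team might track a system capability against these objectives (the checklist structure and names below are our own, not from the report):

```python
from dataclasses import dataclass, field

# The three privacy engineering objectives from NISTIR 8062; modeling them
# as a per-capability checklist is our own illustrative choice.
OBJECTIVES = ("predictability", "manageability", "disassociability")

@dataclass
class ObjectiveReview:
    """Tracks whether a system capability has been assessed against each objective."""
    capability: str
    satisfied: dict = field(
        default_factory=lambda: {name: False for name in OBJECTIVES}
    )

    def unmet(self) -> list:
        return [name for name, ok in self.satisfied.items() if not ok]

review = ObjectiveReview("federated login service")
review.satisfied["predictability"] = True
print(review.unmet())  # ['manageability', 'disassociability']
```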

Lastly, we developed, and confirmed with stakeholders, a privacy risk model to use in conducting privacy risk assessments. We needed a frame of reference for analysis, a clear outcome that organizations could understand and identify. In information security, the risk model is based on the likelihood that a threat could exploit a system vulnerability, and the impact if that occurs. But what is the adverse event when systems process data about people in an authorized manner, meaning any life cycle action the system takes with data, from collection through disposal? We know that people can experience a variety of problems as a result of data processing: psychological problems like embarrassment, or more quantifiable problems like identity theft. We think that if organizations focus on the likelihood that any given action the system takes with data could create a problem for individuals, and on the impact if it did, they gain a clearer frame of reference for analyzing their systems and addressing the concerns they discover.
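To make that frame of reference tangible, here is a sketch of the risk model in code. The data actions, scores, and the simple likelihood-times-impact formula are hypothetical illustrations of the idea; NISTIR 8062 does not prescribe a particular scoring scheme.

```python
from dataclasses import dataclass

@dataclass
class DataAction:
    """An authorized system action on data (collection, retention, sharing, ...)."""
    name: str
    problem: str        # the problem the action could create for individuals
    likelihood: float   # chance the action creates that problem, 0..1
    impact: float       # severity to individuals if it does, on an agreed 1..10 scale

    @property
    def risk(self) -> float:
        # Likelihood that a data action is problematic, times its impact.
        return self.likelihood * self.impact

# Hypothetical assessment of a federated identity system:
actions = [
    DataAction("share attributes with relying parties", "loss of autonomy", 0.4, 6),
    DataAction("log authentication events", "embarrassment via surveillance", 0.2, 5),
    DataAction("retain identity records past need", "identity theft", 0.1, 9),
]

# Prioritize the riskiest data actions for mitigation:
for action in sorted(actions, key=lambda a: a.risk, reverse=True):
    print(f"{action.risk:4.1f}  {action.name}: {action.problem}")
```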

How did this work out for our pilots? Frankly, it exceeded our expectations. Using this privacy risk model, they could identify new privacy risks, prioritize the risks, communicate them to senior management, and implement controls as appropriate (usually some combination of policy-based and technical controls). Shoutout to the pilots—we greatly appreciate your insights!

NISTIR 8062 is only an introduction to privacy engineering and risk management concepts. In the coming months and years, we will continue our engagement with stakeholders to refine these ideas and develop guidance on how to apply them. One of the properties of concrete that makes it so useful is that you can mold it into just about any shape, but once it sets you know exactly what to expect of its performance. This sort of flexible but consistent performance has long eluded those who care about systems-implementable privacy protections.

About the authors

Mike Garcia

Mike Garcia is a PhD economist and Federal 100 award-winning cybersecurity expert. He currently serves as lead for the Trusted Identities Group at the National Institute of Standards and Technology (NIST), working to catalyze commercial and government adoption of innovative online identity solutions and advancing standards, guidance, and measurement science in identity management.

Mike has focused on cyber economics at NIST since 2011 and was previously with the Department of Homeland Security. He has also worked as a market research manager and software engineer. His dissertation analyzed the conditions that induce firms to invest in preventing data breaches and to report them when they happen.

Naomi Lefkovitz

Naomi Lefkovitz is the Senior Privacy Policy Advisor in the Information Technology Lab at the National Institute of Standards and Technology, U.S. Department of Commerce. Her portfolio includes work on the National Strategy for Trusted Identities in Cyberspace (NSTIC), privacy engineering, privacy-enhancing technologies, cybersecurity and standards development.

FierceGovernmentIT named Ms. Lefkovitz on their 2013 “Fierce15” list of the most forward-thinking people working within government information technology, and she is a 2014 Federal 100 Awards winner.

Before joining NIST, she was the Director for Privacy and Civil Liberties in the Cybersecurity Directorate of the National Security Staff in the Executive Office of the President. Her portfolio included the NSTIC as well as addressing the privacy and civil liberties impact of the Obama Administration’s cybersecurity initiatives and programs.

Prior to her tenure at the White House, Ms. Lefkovitz was a senior attorney with the Division of Privacy and Identity Protection at the Federal Trade Commission. Her responsibilities focused primarily on policy matters, including legislation, rulemakings, and business and consumer education in the areas of identity theft, data security and privacy.

At the outset of her career, she was Assistant General Counsel at CDnow, Inc., an early online music retailer.

Ms. Lefkovitz holds a B.A. with honors in French Literature from Bryn Mawr College and a J.D. with honors from Temple University School of Law.

Suzanne Lightman

Suzanne Lightman has over a decade of experience in information security policy in positions across government, as well as in the private sector. She has held positions in both the legislative and executive branches, which gives her a unique perspective on the development and implementation of government policy. Currently, Ms. Lightman is a Senior Advisor in the Computer Security Division of the Information Technology Lab at the National Institute of Standards and Technology (NIST). In that position, she works on a diverse portfolio of topics, including development of the Cybersecurity Framework required under Executive Order 13636, cybersecurity in cyber-physical systems, identity management, and cybersecurity policy. She is also part of the team developing the Privacy Risk Management Framework at NIST.

Ellen Nadeau

Ellen Nadeau is part of the Privacy Engineering Program at the National Institute of Standards and Technology (NIST), where she works to develop and pilot privacy risk management guidance and tools for organizations across sectors. She specializes in privacy-enhancing identity management solutions. Ellen received her Master of Public Administration from New York University, where she was a Scholar for Service at the NYU Center for Interdisciplinary Studies in Security and Privacy. Previously, Ellen worked at a digital rights nonprofit (Derechos Digitales) in Santiago, Chile, as a Google Policy Fellow, and with the National Center for Missing & Exploited Children in the Netsmartz Workshop.

Comments

Way to concrete the foundation
I applaud this development relating to privacy engineering! I view it as a concrete extension of Privacy by Design, one that not only complements PbD but provides solid measures that engineers and systems designers can take. Great work! Ann Cavoukian
A much-welcomed addition. Hopefully this effort will adhere to the KISS principle and acknowledge that not all PII presents the same risks. Privacy could in fact suffer if controls were too rigid and not commensurate with the risks.
The premise of this report, and other policies emerging from OMB, shows the growing recognition that privacy and security practitioners have a mutual responsibility to work together in designing, altering, or integrating systems containing PII. Now if we can get the local and state governments on board, these projects can have a positive psychological effect on every citizen.
It is so good to see NIST bring Privacy out of the closet. I promoted the hints of Privacy in NIST 800-53, but always needed to enhance them with a Privacy Framework, Privacy Impact Assessment, and Privacy Risk Management. It is great to see NIST treat privacy and security as distinct, yet related, disciplines. Well done.
Was the Privacy Risk Assessment Methodology (PRAM) and the PRAM forms (in the original appendix D) removed from the latest version of 8062 for a specific reason? Will there be a separate work product discussing PRAM?
We removed the PRAM from the final version of NISTIR 8062 in an effort to streamline the document and clarify that it is an introductory document rather than guidance. However, we do plan to post the PRAM documents publicly in the near future. In the meantime, if you’d like to leverage these worksheets, please email us at privacyeng [at] nist [dot] gov and we’ll send them to you directly.
