In recent months, there’s been much talk about the idea of companies competing on privacy. In theory, this sounds great: consumers can make choices based on their privacy preferences, and the marketplace will respond. In practice, there are significant challenges. The NSTIC pilots are learning about these challenges firsthand.
The NSTIC calls for the Identity Ecosystem to be privacy-enhancing and voluntary and provides some high-level considerations around these concepts. The pilots are expected to develop identity solutions that adhere to these concepts. But how do they move from high-level considerations to actual implementation? Moreover, how do they achieve an implementation that demonstrates effective privacy protections in consistent and repeatable ways?
In cybersecurity, for example, there are tools such as risk models, control catalogs and technical standards that provide consistent and repeatable results. If an NSTIC pilot wants to securely transmit an attribute, its engineers don’t sit down at their computers and start coding from scratch. There are existing protocols they can select that have been widely evaluated and that demonstrate effective attribute transmission. But what if a pilot wants to collect user consent for the transmission of that attribute? What standard exists for user consent?
The privacy field lags behind other fields such as cybersecurity and safety risk management in providing the types of models and tools that support measurable and consistent outcomes. It is much more difficult for consumers to make informed choices if organizations are marketing their privacy practices with different or, even worse, no measures of effectiveness.
To address this gap, NIST has launched a new privacy engineering effort that focuses on providing design guidance to information system users, owners, developers and designers who handle personal information. Such guidance can be used to decrease risks related to privacy harms and to make purposeful decisions about resource allocation and effective implementation of controls. In April, NIST held the first of a series of workshops. Based on this first workshop, NIST has proposed a set of privacy engineering objectives and a risk model to mitigate privacy harms to individuals. NIST is co-sponsoring a second workshop with the International Association of Privacy Professionals (IAPP) to discuss these proposals and inform the development of a NIST report on privacy engineering. This free workshop will be held in San Jose, California, on September 15-16, 2014.
In the story of the Tower of Babel, God was concerned that a people who spoke one language could take over the world. He prevented this by causing people to speak many languages. I’m no theologian, so I won’t theorize on the merits of God’s actions, but the story does illustrate the power of unity. In privacy, we need to begin speaking with a consistent terminology and using models and tools that provide us with the capability to better measure the effectiveness of privacy design in information systems.
There are many good privacy efforts underway today – but the way to make them better and enable true competition is for experts in various disciplines to collaborate on identifying and adapting measurement capabilities that have worked in other areas. We encourage system designers, engineers and privacy subject matter experts to participate in the next NIST privacy engineering workshop or provide feedback to NIST at privacyeng [at] nist.gov. Together, we can develop the foundational components that will enable the Identity Ecosystem Steering Group to achieve the full vision of the NSTIC Identity Ecosystem: one that is secure and privacy-enhancing.