
Tech Beat - April 14, 2015


Editor: Michael Baum
Date created: June 23, 2010
Date Modified: April 20, 2015 

NIST Tightens the Bounds on the Quantum Information 'Speed Limit'

If you’re designing a new computer, you want it to solve problems as fast as possible. Just how fast is possible is an open question when it comes to quantum computers, but physicists at the National Institute of Standards and Technology (NIST) have narrowed the theoretical limits for where that “speed limit” is. The work implies that quantum processors will work more slowly than some research has suggested.* 

The size of a quantum computer affects how quickly information can be distributed throughout it. The relation was thought to be logarithmic (blue). Progressively larger systems would need only a little more time. New findings suggest instead a power law relationship (red), meaning that the "speed limit" for quantum information transfer is far slower than previously believed. Credit: Irvine/NIST

The work offers a better description of how quickly information can travel within a system built of quantum particles such as a group of individual atoms. Engineers will need to know this to build quantum computers, which will have vastly different designs and be able to solve certain problems much more easily than the computers of today. While the new finding does not give an exact speed for how fast information will be able to travel in these as-yet-unbuilt computers—a longstanding question—it does place a far tighter constraint on where this speed limit could be.

Quantum computers will store data in a particle’s quantum states—one of which is its spin, the property that confers magnetism. A quantum processor could suspend many particles in space in close proximity, and computing would involve moving data from particle to particle. Just as one magnet affects another, the spin of one particle influences its neighbor’s, making quantum data transfer possible, but a big question is just how fast this influence can work.

The NIST team’s findings advance a line of research that stretches back to the 1970s, when scientists discovered a limit on how quickly information could travel if a suspended particle only could communicate directly with its next-door neighbors. Since then, technology advanced to the point where scientists could investigate whether a particle might directly influence others that are more distant, a potential advantage. By 2005, theoretical studies incorporating this idea had increased the speed limit dramatically.

“Those results implied a quantum computer might be able to operate really fast, much faster than anyone had thought possible,” says NIST’s Michael Foss-Feig. “But over the next decade, no one saw any evidence that the information could actually travel that quickly.”

Physicists exploring this aspect of the quantum world often line up several particles and watch how fast changing the spin of the first particle affects the one farthest down the line—a bit like standing up a row of dominoes and knocking the first one down to see how fast the chain reaction travels. The team looked at years of others’ research and, because the dominoes never seemed to fall as fast as the 2005 prediction suggested, developed a new mathematical proof that reveals a much tighter limit on how fast quantum information can propagate.

“The tighter a constraint we have, the better, because it means we’ll have more realistic expectations of what quantum computers can do,” says Foss-Feig. 

The limit, their proof indicates, is far closer to the speed limits suggested by the 1970s result. 

The proof addresses the rate at which entanglement propagates across quantum systems. Entanglement—the weird linkage of quantum information between two distant particles—is important, because the more quickly particles grow entangled with one another, the faster they can share data. The 2005 results indicated that even if the interaction strength decays quickly with distance, as a system grows, the time needed for entanglement to propagate through it grows only logarithmically with its size, implying that a system could get entangled very quickly. The team’s work, however, shows that propagation time grows as a power of its size, meaning that while quantum computers may be able to solve problems that ordinary computers find devilishly complex, their processors will not be speed demons.
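The gap between the two scaling laws can be seen with a quick numerical comparison. The sketch below is purely illustrative: the prefactors and the power-law exponent are made-up stand-ins (the paper's title, "nearly linear light cones," suggests an exponent near 1), not values taken from the NIST proof.

```python
import math

def t_log(n, c=1.0):
    """Entanglement propagation time under the 2005 logarithmic bound (illustrative)."""
    return c * math.log(n)

def t_power(n, c=1.0, alpha=1.0):
    """Propagation time under a power-law bound; alpha=1 is a 'nearly linear' light cone."""
    return c * n ** alpha

# Compare the two bounds as the system grows.
for n in (10, 1_000, 100_000):
    print(f"N={n:>6}: log bound ~ {t_log(n):8.1f}, power-law bound ~ {t_power(n):10.1f}")
```

Even with generous constants, the power-law time dwarfs the logarithmic one as the system grows, which is why the revised limit implies much slower information transfer across a large quantum processor.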

“On the other hand, the findings tell us something important about how entanglement works,” says Foss-Feig. “They could help us understand how to model quantum systems more efficiently.”

*M. Foss-Feig, Z.-X. Gong, C.W. Clark and A.V. Gorshkov. Nearly linear light cones in long-range interacting quantum systems. Phys. Rev. Lett. 114, 157201 (2015). Published online April 13, 2015.

Media Contact: Chad Boutin, 301-975-4261


Report Charts a Research Path for Improving 'Printed' Metal Parts

Additive manufacturing has been called a game changer. But new games require new instructions, and the manuals for a growing assortment of methods for building parts and products layer-by-layer—collectively known as "3D printing"—still are works in progress.

A high-power laser spot scans back and forth over a layer of cobalt-chrome powder on NIST's powder bed fusion additive manufacturing machine. Where it touches, the powder melts and fuses to underlying layers until a fully dense 3D metal part is formed. The laser travels back and forth so fast that, in this still image frame, it appears to form a white hot stripe about 10 millimeters wide.
Credit: Lane/NIST

Manufacturing researchers at the National Institute of Standards and Technology (NIST) have scoped out the missing sections in current guidelines for powder bed fusion, the chief method for "printing" metal parts. The new NIST report* identifies key unknowns that must be solved before the technique—one of the most promising and versatile of additive manufacturing processes—can progress from largely a "trial-and-error" method to one that can be fine-tuned automatically.

The report presents an integrated view that systematically links process inputs to in-process phenomena that might either be measured or modeled and to the ultimate determinants of part quality such as material characteristics, dimensional accuracy and surface roughness.

One of seven categories of additive manufacturing processes (as defined by an ASTM standard), powder bed fusion usually employs a laser to selectively heat, melt and fuse the thin top layer of metal particles on a bed of powder. Once a layer is completed, more powder is spread on top and the process is repeated until accumulated layers comprise the designed part. Products ranging from medical implants to fuel nozzles for jets already are made with the process.
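The spread-fuse-repeat cycle described above lends itself to a simple sketch. The function and parameter names below are hypothetical, and the 40-micrometer layer thickness is just a representative figure, not a value from the report:

```python
def spread_powder():
    pass  # placeholder for the physical recoating step

def scan_and_fuse(layer):
    pass  # placeholder for the laser scan of one layer's cross-section

def build_part(total_height_mm, layer_thickness_mm=0.04):
    """Toy sketch of the powder bed fusion cycle (all numbers illustrative)."""
    layers_needed = round(total_height_mm / layer_thickness_mm)
    for layer in range(layers_needed):
        spread_powder()       # recoater spreads a thin layer of metal powder
        scan_and_fuse(layer)  # laser selectively melts this layer, fusing it to the one below
    return layers_needed

print(build_part(20.0))  # a 20 mm part at 40 µm per layer → 500 layers
```

The sketch makes the practical trade-off visible: part height divided by layer thickness sets the number of cycles, which is why metal builds can be slow.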

And aerospace and automotive manufacturers, in particular, would like to add significantly to the technique's repertoire of metal-part-making capabilities. But that will require advances in process control and reliability.

Powder bed fusion, like other additive manufacturing approaches, offers several advantages over conventional manufacturing methods, which often entail removing, or machining, portions from a blank of starting material and usually require joining several machined pieces to build a functional part or product. Although the build process can be slow for metal parts, additive manufacturing is easily customized to make parts with far more complex shapes and features, enabling innovation in design. Parts can be designed to be significantly lighter with the same functionality, and scraps of leftover material are minimal.

However, powder bed fusion of metal parts is beset by system performance and reliability issues that can undermine part quality, problems shared by other additive manufacturing methods. Examples include dimensional and form errors, unwanted voids in the fused layers, high residual stress in the final parts, and poorly understood material properties such as hardness and strength.

Robust process control through in-process sensing and real-time control can prevent or correct these problems, but achieving this solution requires detailed understanding of all of the many intricacies of powder bed fusion, according to the NIST researchers. In their review of previous research, they cite one study claiming that more than 50 factors influence the melting process alone.

On the basis of their survey of previous research, including NIST studies, the NIST team distilled a detailed breakout consisting of 12 categories of "process parameters," 15 types of "process signatures," and six categories of "product qualities." They then chart the cause-and-effect relationships among variables in each of the three categories.
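A three-level breakout of this kind can be pictured as a small cause-and-effect graph. The entries below are illustrative examples in the spirit of the report, not its actual taxonomy of 12 parameters, 15 signatures and six qualities:

```python
# Hypothetical cause-and-effect map: process parameters influence in-process
# signatures, which in turn determine product qualities.
process_parameters = {
    "laser power": ["melt pool size", "melt pool temperature"],
    "scan speed": ["melt pool size", "cooling rate"],
    "layer thickness": ["melt pool depth"],
}
process_signatures = {
    "melt pool size": ["porosity", "surface roughness"],
    "melt pool temperature": ["residual stress"],
    "cooling rate": ["microstructure", "residual stress"],
    "melt pool depth": ["porosity"],
}

def qualities_affected_by(parameter):
    """Trace one process parameter through its signatures to product qualities."""
    out = set()
    for signature in process_parameters.get(parameter, []):
        out.update(process_signatures.get(signature, []))
    return sorted(out)

print(qualities_affected_by("scan speed"))
# → ['microstructure', 'porosity', 'residual stress', 'surface roughness']
```

Charting the relationships this way shows which sensors matter for which defects: a quality problem can be traced back through its signatures to the parameters a controller could adjust.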

"This cause-and-effect breakout can guide research to develop measurement and sensing capabilities as well as modeling and simulation tools—all with the aim of enabling better process control," says NIST mechanical engineer Brandon Lane, one of the report's authors.

In the next phase of the research, a NIST research team will build an additive manufacturing test bed for evaluating in-process measurement and control methods. This tool, the report says, will enable the researchers to "observe melting and solidification of metal powders, integrate process metrology tools, and implement software interfaces and data acquisition for process measurements, as well as test control algorithms."

* M. Mani, B. Lane, A. Donmez, S. Feng, S. Moylan and R. Fesperman. Measurement Science Needs for Real-time Control of Additive Manufacturing Powder Bed Fusion Processes (NISTIR 8036), Feb. 2015.

Media Contact: Mark Bello, 301-975-3776


NIST Develops NMR ‘Fingerprinting’ for Monoclonal Antibodies

National Institute of Standards and Technology (NIST) researchers at the Institute for Bioscience and Biotechnology Research (IBBR) have demonstrated the most precise method yet to measure the structural configuration of monoclonal antibodies (mAbs), an important factor in determining the safety and efficacy of these biomolecules as medicines. Monoclonal antibodies are proteins manufactured in the laboratory that can target specific disease cells or antigens (proteins that trigger an immune reaction) for removal from the body. The method described in a recent paper* may soon help manufacturers and regulators better assess and compare the performance and quality of mAbs.

A schematic showing the NISTmAb monoclonal antibody, an immunoglobulin G (IgG) molecule being developed by NIST as a reference material. The labels mark the fragments Fab and Fc that were used in the novel NIST two-dimensional NMR fingerprinting method to measure the structural configuration of the entire antibody. Credit: NIST

The IBBR is a joint institute of NIST and the University of Maryland.

Monoclonal antibodies can be used as extremely specific therapeutic agents, including ones designed to target cancer cells unique to an individual. However, in order to properly function as a biotherapeutic agent, the molecule’s structural units—amino acids—must fold into a three-dimensional structure that aligns its active regions with corresponding receptor sites on a target cell or antigen. If misfolding occurs, a potent and safe treatment may become ineffective, or worse, provoke a dangerous or fatal immune reaction. High-resolution spectral analysis—imaging at the atomic level where even the bonds between hydrogen and carbon atoms are distinguishable—is required to precisely define the mAb’s structure and determine if the protein is folding properly. 

“We refer to this as ‘measuring fingerprints,’ because just as a person has a unique set of fingerprint patterns, each mAb has a one-of-a-kind spectral makeup,” says NIST research chemist Robert Brinson. “If we can map that spectral fingerprint, we can determine whether or not folding is occurring as desired.” 

To do this, the IBBR team turned to a solution that would surprise most biopharmaceutical experts: two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy. NMR is a technique that measures the atomic signature of a molecule similar to how doctors use magnetic resonance imaging (MRI) to noninvasively view organs. “To date, it’s been assumed that 2D NMR could not be practically applied to monoclonal antibodies because it’s too insensitive, too time intensive and too expensive for analyzing anything other than much smaller drug molecules,” Brinson explains.

In pushing the boundaries of the technique, the IBBR team used an NMR system with a high magnetic field strength to produce the first 2D NMR map of a complete, drug-like mAb.** The map was generated using signals from methyl groups.

“Methyl groups are dispersed throughout the mAb structure and, in particular, in the folded cores of the molecule that we want to evaluate,” Brinson says. “We can use their signals to yield a specific spectral fingerprint that reflects the unique structure of the mAb.”

To make the 2D NMR method more accessible to the lower-strength magnetic field instruments found in most analytical research labs, the IBBR team narrowed the analysis by dividing its sample antibody into two structural fragments. “We mapped the 2D NMR signals generated by the subset of methyl groups found in these fragments, both about a third of the size of the entire protein,” Brinson says. “The sum of the data gained from this analysis was found to be a good proxy for the spectral fingerprint of the full mAb.” 

The new 2D NMR fingerprinting method also overcomes the problems of cost and time. “We reduced the time needed for our measurements from many hours to about 30 minutes,” Brinson says.

Brinson says that he and his colleagues are now working on a statistical method that will allow users of their 2D NMR methodology to compare fingerprints from multiple protein samples. “With that ability, manufacturers will be able to quantitatively show that spectra obtained from different lots of the same drug product are identical, enabling them to better meet regulatory requirements for quality and performance,” he says.

* L.W. Arbogast, R.G. Brinson and J.P. Marino. Mapping monoclonal antibody structure by 2D 13C NMR at natural abundance. Analytical Chemistry 87: 3556–3561 (2015). DOI: 10.1021/ac504804m
** The monoclonal antibody used in this experiment is NISTmAb, an immunoglobulin G type 1 donated by MedImmune and being developed by NIST as a reference material.

Media Contact: Michael E. Newman, 301-975-3025


NIST Makes 'Bio Inspired' Flame Retardants in a Jiffy

After devising several new and promising "green" flame retardants for furniture padding, National Institute of Standards and Technology (NIST) researchers took a trip to the grocery store and cooked up their best fire-resistant coatings yet. As important, these protective coatings can be made in one straightforward step.

As reported in a new article,* the NIST team prepared nine water-based mixtures made up of various combinations of potato starch, seaweed gel (agar), laundry booster, clay and similar everyday compounds. In laboratory tests, six of these "bioinspired" coatings reduced the peak heat release rate—a key measure of flammability—of polyurethane foam by at least 63 percent, compared with untreated foam.

Encouraged by the lab results, the team subjected the top-performing mixture—starch and a boron-containing compound used in deodorant and other products—to a full-scale fire test. That entails igniting the seat cushions of entire chairs padded with treated or untreated polyurethane foam.

The untreated chair, upholstered with a synthetic fabric, was engulfed in flames 90 seconds after ignition and completely consumed in less than two minutes.

In contrast, the fire in the chair treated with the NIST-devised coating remained confined to the cushion 90 seconds after ignition, even though the fabric covering had burned completely. The researchers recorded a 71 percent drop in the total amount of heat released, so that combustion could not be sustained and the flames did not spread.

Furniture fires are the leading cause of casualties in house fires. In 2013, they accounted for about 30 percent of more than 2,700 deaths in residential fires, according to the National Fire Protection Association. Fires that start in furniture, such as those lit by a burning cigarette or a nearby heater, account for the largest share, but furniture also serves as a major fuel source for fires that originate elsewhere.

"The results of the full-scale fire tests are very encouraging," says NIST team leader Rick Davis. "The performance of our coating suggests that fire can be contained to burning furniture so that it does not spread, intensify to the point of flashover, and increase the risk of injury or death."

Earlier candidates for "green" flame retardants formulated by the NIST group were made with a "layer by layer" deposition process that required repeating a series of steps to create stacks of coating layers. The newest coatings were crafted with what the researchers call a "one-pot" process: add ingredients to water, heat, stir until the solution turns to a gel, and then cool. Depending on the ingredients, preparation times ranged from about 30 minutes to two hours.

To achieve uniform coverage, foam is immersed in the solution for two minutes.

The uncomplicated process could lend itself to industry adoption. However, additional research is necessary to determine the durability of the new coatings and to assess other properties affecting performance and manufacturing applications.

In addition to furniture, chemical flame retardants are used in a variety of other consumer products. Several have been banned, and some others have been linked to human health risks and environmental problems. NIST's bio-inspired, experimental coatings are contributing to the search for alternatives.

Fire tests demonstrate the flame-retarding benefits of new "bio-inspired" coatings developed by NIST researchers. One chair (top) is padded with untreated foam; padding in the other chair is coated with the experimental flame retardant. After ignition (0:24), the upholstery fabric on both chairs is consumed in flames (2:30). Six minutes after ignition, the untreated cushion has burned completely, leaving a melt pool that continues to burn. Flames on the treated padding release less heat and remain confined to the cushion. Unable to sustain combustion, flames on the treated cushion are nearly extinguished after nine minutes.
Credit: NIST

Full clips of these NIST burn tests may be viewed here:
Untreated foam cushions (top) | Starch, Boron, MMT PP-PE Fabric Chair

* R. Davis, Y.C. Li, M. Gervasio, J. Luu and Y.S. Kim. One-pot, bio-inspired coatings to reduce the flammability of flexible polyurethane foams. ACS Applied Materials & Interfaces (2015). DOI: 10.1021/acsami.5b01105

Media Contact: Mark Bello, 301-975-2763


NIST to Launch Review of Resilience Planning Guide at Houston Workshop, April 27, 2015

The National Institute of Standards and Technology (NIST) will issue the full draft of its Community Resilience Planning Guide for Buildings and Infrastructure for public comment during a workshop at Texas Southern University in Houston, Texas, on April 27. This event, which is open to the public, will serve as the springboard for discussion of the draft guide during the 60-day review-and-comment period.

Tornado destruction, Joplin, Mo.: a collapsed building that once housed the backup generator for St. John's Regional Medical Center.
Credit: NIST

The draft planning guide was developed by NIST researchers and outside experts in disciplines ranging from buildings to public utilities and from earthquake engineering to sociology. Input from stakeholders was gathered during a series of regional meetings and through submissions directly to the NIST Community Resilience Group.

The guide provides an integrated perspective on improving the resilience of buildings and infrastructure systems in the context of sustaining community functions such as health care, education and other important local services. It is designed to help communities develop and implement plans that will help to prevent and reduce devastation from natural and human-caused hazards and prolonged recoveries in their aftermath.

Workshop participants will be introduced to the planning guide and discuss how it can be applied in communities. Government representatives from four communities will share their plans for implementing the framework in their localities. NIST will discuss next steps, including formation of a Disaster Resilience Standards Panel. This panel of stakeholders will further develop the guide and provide more specific implementation guidance to enhance community resilience.

Registration and additional details about this event are available online.

Media Contact: Mark Bello, 301-975-3776


Carroll Thomas Named Director of the Hollings Manufacturing Extension Partnership

Carroll Thomas. Credit: NIST

Carroll Thomas has been appointed director of the Hollings Manufacturing Extension Partnership (MEP) at the National Institute of Standards and Technology (NIST). In partnership with local nonprofits, the MEP provides support to small and medium-size U.S. manufacturers through a network of centers in every state and Puerto Rico.

“Carroll brings relevant experience from the federal, private and nonprofit sectors, as well as nearly 12 years at MEP,” said NIST Acting Director and Acting Under Secretary of Commerce for Standards and Technology Willie May. “She understands the value of collaboration and creating strong networks that bring together varied resources to achieve a common goal.”

Thomas comes to NIST from the Small Business Administration (SBA), where she served as associate administrator of the Office of Small Business Development Centers (SBDC). The SBDC is analogous to the MEP program in that its $114 million in federal funds support at least half of the funding provided to 63 host organizations with expert business assistance counselors in service centers in all 50 states and five U.S. territories. During her tenure at SBA, Thomas established innovative grant policies, procedures and processes to oversee $20 million in Disaster Recovery Act funding for small businesses affected by Superstorm Sandy. She also was the principal SBA senior executive in a major partnership with the U.S. State Department Western Hemisphere Office and initiated an innovative pilot with the Mexican government to work with SBA cluster organizations and Small Business Technology Development Centers.

MEP’s network includes more than 440 service locations with more than 1,200 staff serving as business advisors and technical experts to help manufacturers increase profits, create and maintain jobs and establish a foundation for long-term growth and productivity.

“Carroll was selected through a rigorous competitive process,” said NIST Associate Director for Innovation and Industry Services Phil Singerman. “I am confident she will be a great addition to the MEP program.”

Prior to assuming her position at the SBA, Thomas worked at NIST MEP from January 2000 to November 2012. During her tenure at NIST MEP, Thomas was a member of the Marketing and Communications Office responsible for numerous branding initiatives. She also worked in the Program Development Office and led major programs, including starting the Interagency Network of Enterprise Assistance Providers and many of MEP’s technology acceleration activities.

In addition, Thomas held special assignments in the NIST Program Office, the Office of the Secretary of Defense and the Department of Commerce’s Budget and Policy and Strategic Planning offices. Thomas has received numerous leadership and performance awards, including the Commerce Gold and Silver medals, and three NIST George A. Uriano awards for outstanding achievement in building or strengthening NIST extramural programs.

Prior to joining federal service, Thomas worked in the private and nonprofit sectors. She holds a bachelor of science degree from Drexel University and an MBA from the Johns Hopkins Carey School of Business.

Media Contact: Jennifer Huergo, 301-975-6343


NIST Requests Comments for Revising Its Electronic Authentication Guideline

The National Institute of Standards and Technology (NIST) is requesting public comment on a possible update of its 2012 Electronic Authentication Guideline.


Electronic authentication verifies the identity of a user when they log in to an information system, ensuring that the remote user is who they claim to be. The identity established during authentication can be pseudonymous—that is, the true identity of the person is unknown, but the fact of the right to access is established.

Many online interactions demand a high level of confidence in authentication, so methods that go beyond the familiar username/password combination are imperative for the future.
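One widely deployed second factor beyond the password is a time-based one-time password (TOTP), standardized in RFC 6238. The sketch below implements the algorithm with only the Python standard library; the Base32 secret shown is a made-up example, not a real credential:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, at: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238) using HMAC-SHA1."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", at // step)            # time window as 8-byte counter
    digest = hmac.new(key, counter, "sha1").digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Example secret; in practice it is provisioned once (e.g. via a QR code)
# and stored by both the server and the user's authenticator app.
secret = "JBSWY3DPEHPK3PXP"
print(totp(secret, at=int(time.time())))
```

Because both sides derive the same short-lived code from a shared secret and the current time, a stolen password alone is no longer enough to log in.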

"Given innovations in the marketplace and the increase of online federal services, we think it is appropriate to consider an update of NIST's Electronic Authentication Guideline," says NIST senior advisor Paul Grassi. "In addition, as the Identity Ecosystem envisioned by the National Strategy for Trusted Identities in Cyberspace (NSTIC) continues to evolve, NIST guidelines should reflect and support it."

As the first step in revising the publication, NIST is soliciting recommendations from experts (including those in industry, government, and educational fields) on which sections of the document need to be revised. In addition to overall technology changes, the revision is driven by three recent developments in the federal government.

  • The October 2014 Executive Order 13681, Improving the Security of Consumer Financial Transactions, requires "…that all agencies making personal data accessible to citizens through digital applications require the use of multiple factors of authentication and an effective identity proofing process, as appropriate."
  • The roadmap accompanying the February 2014 Framework for Improving Critical Infrastructure Cybersecurity acknowledges that updates to NIST special publications may be considered to support improved authentication practices. The framework was published in response to Executive Order 13636, Improving Critical Infrastructure Cybersecurity.
  • NSTIC charts a course for public and private-sector collaboration to raise the level of trust of identities involved in online transactions through an Identity Ecosystem. NSTIC calls for the federal government to "lead by example and implement the Identity Ecosystem for the services it provides internally and externally." In addition, as the NSTIC pilots continue to gather critical lessons learned, it is imperative to consider applying these outcomes to the Electronic Authentication Guideline.

Like the original version, the revised guideline will supplement the Office of Management and Budget's E-Authentication Guidance for Federal Agencies.

The current version of NIST's Electronic Authentication Guideline and the accompanying Note to Reviewers are available online. Please send your questions and comments by May 22, 2015.

Edited on April 15, 2015, to correct deadline date for comments.

Media Contact: Evelyn Brown, 301-975-5661


Securing a Public Safety Broadband Network with Identity Management

The First Responder Network Authority (FirstNet), part of the National Telecommunications and Information Administration (NTIA), is building a nationwide public safety broadband network to provide first responders with access to 21st century communication technology that improves safety and security. But how do you make sure the network only helps the good guys? The National Institute of Standards and Technology (NIST) has published an analysis of how a public safety broadband network can be secured so that only approved first responders and public safety personnel can access it.

The planned broadband network will offer emergency and law enforcement personnel many advantages, including audio, video and real-time information sharing via broadband cellular technologies to mobile devices. In the past, for example, firefighters from different jurisdictions teaming up to fight a major fire have had communications problems because their systems use different radio frequencies or encryption technologies. A nationwide wireless broadband network would provide a mechanism not only to integrate those systems, but also to provide layers of advanced support. It could, for example, integrate a live video feed from a robot sent into a fire to search for people, or let chiefs located at different points around a fire see a live feed on their mobile devices. This stands to improve situational awareness and enable incident commanders to make more informed decisions.

But such a network also comes with challenges: not only ensuring that only authorized people can use it, but also ensuring that sensitive information, such as a criminal record, is shared only with sanctioned personnel who have a “need to know.” 

NIST’s analysis draws on its expertise in identity management for mobile devices. Considerations for Identity Management in Public Safety Networks provides background information on identity management and a review of applicable federal and industry guidance for using next generation networks with a number of options for policy makers to consider. Topics include selecting identity credentials and the authentication process. It also includes analysis of possible identity management technologies that could be used. 

The NIST analysis on securing first responder communications may be applicable to local public safety networks, private sector communities and public safety applications that leverage identity management services, including criminal justice information and records management systems. Considerations for Identity Management in Public Safety Networks is published as NISTIR 8014.

Media Contact: Evelyn Brown, 301-975-5661


New Report: Online Identity Projects are Creating More Options for Businesses, Individuals

Organizations and individuals have more options for securely and conveniently confirming identity online, thanks to a grant program that has been funding pilot projects in support of the National Strategy for Trusted Identities in Cyberspace (NSTIC). A new report on the pilots by the NSTIC program office at the National Institute of Standards and Technology (NIST) says that the projects also are revealing and overcoming barriers to a marketplace for obtaining online credentials that can be used instead of passwords for shopping, banking and other interactions.

The benefits and challenges to online authentication discovered so far in the NSTIC program are laid out in NSTIC Pilots: Catalyzing the Identity Ecosystem. Since 2012, NIST has awarded approximately $30 million to fund 15 pilot projects in the health care, financial, education, retail, aerospace and government sectors, among others.

"We've designed our grant program to focus on complete solutions, not just technologies," said Mike Garcia, deputy director of the NSTIC National Program Office. "As important as it is to make available new technologies, the pilots are also helping to foster new infrastructure and policy mechanisms that will make a real difference in the way we approach online identity as a whole."

The strategy calls for the private sector to lead development of an "identity ecosystem" where individuals and organizations can choose from a variety of credentials to prove who they are online. The pilots are, in effect, seeding the marketplace with identity solutions that are aligned with the strategy to enhance privacy, security, interoperability and convenience.

Each set of pilot grantees seems to be building on the successes and lessons learned from the previous round of awardees, according to the report. The program itself is evolving from addressing broad challenges to overcoming more specific gaps in the market.

The lessons learned from the pilots also are informing other online authentication efforts; in particular, the work of the Identity Ecosystem Steering Group (IDESG), which received grant funding from NIST. Members of the pilot teams have been actively engaged in the IDESG and are helping to advance the organization's work by informing and testing materials it has produced.

The report provides specific examples of the impact of the pilot projects. For example, 2012 grantee the American Association of Motor Vehicle Administrators' Cross-Sector Digital Identity Initiative exposed the need for legislation to address online identity and informed Virginia's recently passed Electronic Identity Management Act. Another grantee's pilot project, which enables nearly one million military members, first responders, teachers and students to verify their affiliations to receive discounts and more, has increased the number of companies relying on its service by 167 percent. Using that service, sports clothing and accessories maker Under Armour increased revenue from the military and first responder market by more than 30 percent.

This year, NIST plans to fund a fourth and fifth round of pilot projects. The NSTIC National Program Office is reviewing abbreviated applications for an opportunity announced in February 2015. Through May 28, 2015, it will accept submissions for a new funding opportunity focused on privacy-enhancing technologies announced in March. NIST will hold a webinar on April 15, 2015, to offer general guidance on preparing applications and to answer questions about NSTIC and the grants process.

The full report, NSTIC Pilots: Catalyzing the Identity Ecosystem (NISTIR 8054), can be found at

Media Contact: Jennifer Huergo, 301-975-6343

back to top

NIST Calls for Final Comments on Guide to Protecting Sensitive Information on Nonfederal Systems

The National Institute of Standards and Technology (NIST) is requesting comments on the second and final draft of a guidance document for federal agencies on protecting the confidentiality of sensitive federal information when such information resides in nonfederal information systems and organizations. This draft contains significant changes from the original draft, which was issued in November 2014.*

Executive Order 13556 established the Controlled Unclassified Information (CUI) Program to standardize the way the executive branch handles unclassified information that requires protection, and designated the National Archives and Records Administration (NARA) to implement that program.

As part of this implementation, NARA is seeking to develop a standardized, government-wide approach for protection of CUI when nonfederal organizations are in possession of this information. Nonfederal organizations include, for example, contractors, state and local governments, and colleges and universities.

The protection of CUI is critical to the national and economic security interests of the United States. The CUI Registry, managed by NARA, contains an extensive list of CUI categories and subcategories that are the exclusive designations for information throughout the executive branch requiring controls based on law, regulations or government-wide policies. Some examples of CUI Registry categories are critical infrastructure, emergency management, financial, intelligence, law enforcement, patent and privacy.

NIST and NARA joined forces in 2014 to write Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations.** The publication provides federal agencies with guidance on how to protect the confidentiality of CUI consistent with law, regulation or government-wide policy. It is meant for federal employees with responsibilities for information systems development, acquisition, management and protection.

The changes in the final public draft are based on comments received from both the public and private sectors. In particular, the final draft:

  • Clarifies the publication's purpose, scope and applicability, and defines underlying assumptions and expectations in applying the recommended CUI security requirements;
  • Explains how the publication relates to the CUI federal rule and the planned Federal Acquisition Regulation clause that NARA will sponsor next year;
  • Adjusts the CUI security requirements to ensure complete coverage and traceability to federal policies, standards and guidance;
  • Provides tables that map CUI security requirements to security controls in NIST Special Publication (SP) 800-53 and the ISO/IEC 27001 standard that are the basis for many computer security programs;
  • Provides additional tables that illustrate how the moderate security control baseline in the federal government's foundational computer security document, NIST SP 800-53, was tailored to handle CUI in nonfederal systems and organizations; and
  • Adds guidance on use of the mapping tables to support those nonfederal organizations that are implementing the NIST Framework for Improving Critical Infrastructure Cybersecurity.

Comments on the final public draft of Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations should be sent to by May 12, 2015. The publication is available at:

* See the November 2014 NIST Tech Beat story, "Filling the Gap: NIST Document to Protect Federal Information in Nonfederal Information Systems."

** R. Ross, P. Viscuso, G. Guissanie, K. Dempsey and M. Riddle. Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations. (NIST Final Public Draft Special Publication 800-171), April 2015.

Media Contact: Evelyn Brown, 301-975-5661

back to top

NIST Requests Comments on Its Big Data Framework Development

As part of a major collaborative effort to develop a standard framework to make it easier for scientists to use "Big Data" sets in their work, the National Institute of Standards and Technology (NIST) is publishing for public comment a draft NIST Big Data Interoperability Framework.


The seven-volume publication of Big Data foundational documents will serve as the U.S. input to the international standards community.

Big Data is the term used to describe the deluge of data in our networked, digitized, sensor-laden, information-driven world. Big Data collections are measured in trillions of bytes (terabytes) and thousands of trillions of bytes (petabytes). The data range from text, images and audio collected from social media to the output from physics experiments that deliver data points 40 million times a second. New technology is evolving to harness the rapid growth of data.

There is broad agreement among commercial, academic and government leaders that effective use of Big Data has the potential to spark innovation, fuel commerce and drive progress. The availability of vast data resources carries the potential to answer questions previously out of reach: How do we reliably detect a potential pandemic early enough to intervene? Can we predict the properties of new materials even before they've been synthesized? How can we reverse the current advantage of the attacker over the defender in guarding against cybersecurity threats?

However, there is also broad agreement on the ability of Big Data to overwhelm traditional approaches. The rate at which data volumes, speeds and complexity are growing is outpacing scientific and technological advances in data analytics, management, transport and more.

A lack of consensus on some important, fundamental questions about Big Data is confusing potential users and holding back progress. What are the attributes that define Big Data solutions? How is Big Data different from the traditional data environments and related applications that we have encountered thus far? How do Big Data systems integrate into our current data systems? What are the central scientific, technological and standardization challenges that need to be addressed to accelerate the deployment of robust Big Data solutions?

"One of NIST's Big Data goals was to develop a reference architecture that is vendor-neutral, and technology- and infrastructure-agnostic to enable data scientists to perform analytics processing for their given data sources without worrying about the underlying computing environment," said NIST's Digital Data Advisor Wo Chang.

To do this, NIST formed the NIST Big Data Public Working Group (NBD-PWG) with members from industry, academia and government from around the world. The working group has developed consensus definitions, taxonomies, key requirements for data security and privacy protections, a proposed reference architecture and a standard roadmap.

The findings of the NBD-PWG to date make up the NIST Big Data Interoperability Framework:

  • Volume 1: Definitions
  • Volume 2: Taxonomies
  • Volume 3: Use Cases and General Requirements
  • Volume 4: Security and Privacy
  • Volume 5: Architectures White Paper Survey
  • Volume 6: Reference Architecture
  • Volume 7: Standards Roadmap

The NIST Big Data Interoperability Framework may be found on the NIST Big Data Public Working Group page at The deadline for comments is May 21, 2015. Please send them to

Media Contact: Evelyn Brown, 301-975-5661

back to top

NIST Invites Comments on Challenges in Protecting Consumer Data

The National Institute of Standards and Technology (NIST) invites the public to comment on a report from the Feb. 12, 2015, Executive Technical Workshop on Improving Cybersecurity and Consumer Privacy. The workshop, a collaboration with Stanford University, brought together chief technology officers, information officers and security executives to discuss the challenges their organizations and industrial sectors face in implementing advanced cybersecurity and privacy technologies.

The focus of the meeting was on organizations such as those in the retail, hospitality or health care industries that deal directly with consumers and transmit or store data from clients, customers or patients. The draft report, Executive Technical Workshop on Improving Cybersecurity and Consumer Privacy (NIST IR 8050), summarizes the participants' key points and suggests areas for future cybersecurity efforts led by NIST.

"We'd like to hear from workshop participants and those who couldn't be there to help us develop and prioritize future NIST cybersecurity projects," said Donna Dodson, chief cybersecurity advisor for NIST. "Feedback such as this helps us ensure that we focus our efforts on projects that would be of the most value to industry and consumers."

Workshop participants represented a wide variety of sectors and organizations in a range of sizes. Despite their organizational differences, they agreed on many broad points, including:

  • Both organizations and consumers are responsible for safeguarding digital assets, and organizations can help consumers realize stronger cybersecurity protections through education, training and privacy policies that are more transparent and clear—and security measures that are easier for consumers to use.
  • People who develop software and applications need security tools that are easier to include in their products.
  • Cybersecurity products and services must be easier for security technologists to use.
  • The entire cybersecurity community—including government, industry and academia—needs to work together to address large issues. Participants asked NIST to act as a convener for these efforts.

The National Cybersecurity Center of Excellence (NCCoE) played a large role in organizing the meeting, with participation from NIST staff who work on many aspects of cybersecurity including the Framework for Improving Critical Infrastructure Cybersecurity, the National Initiative for Cybersecurity Education (NICE) and the National Strategy for Trusted Identities in Cyberspace (NSTIC). All of these groups would participate in projects that might result from the February meeting, particularly the NCCoE, which concentrates on developing practical, real-world cybersecurity approaches that can be rapidly implemented.

NIST will host a public meeting from 1 to 3 p.m. Pacific time on April 21, 2015, at 835 Market Street, San Francisco. The site is on the downtown campus of San Francisco State University and within walking distance of the RSA Conference USA 2015, being held April 20-24 at the Moscone Center. A second meeting will be scheduled for early summer 2015 to receive feedback and further refine the scope of the projects listed in the report.

The summary document can be found on the NCCoE website. The comment period will run through May 17, 2015. Comments can be submitted via a form at or to, and will be made publicly available after review.

Media Contact: Jennifer Huergo, 301-975-6343

back to top

Baldrige Program Tops in Leadership Development for Two Straight Years

For the second year in a row, the Baldrige Performance Excellence Program (BPEP) has received the first place award in the government and military category of the Leadership 500 Excellence Awards, an annual recognition of the world's best leadership development programs and initiatives. The honor was announced on March 31, 2015—World Leadership Day—during a dinner at the LEAD 2015 conference in Dallas, Texas.

Rounding out the top three awards in the government and military category were two Texas organizations: the City of Houston Learning and Development Center – The Center for Excellence and the Texas Comptroller of Public Accounts.

Given out by Leadership Excellence Essentials magazine for more than 30 years, the Leadership 500 Excellence Awards rank the top leadership development programs in each of the following organizational types: small, midsize, large, government/military, nonprofits, international, educational, small partners/providers, midsize partners/providers, large partners/providers and international partners/providers. The judges use applications, survey responses and interviews to evaluate each candidate's programs, ranking them on seven criteria: vision/mission, involvement/participation, accountability/measurement, design/content/curriculum, presenters/presentations/delivery, take-home value and outreach.

Since 2010, BPEP has placed four times in the top 10 of the Leadership 500 Excellence Award's government/military rankings. 

The Baldrige Program raises awareness about the importance of performance excellence in driving the U.S. and global economy; provides organizational assessment tools and criteria; educates leaders in businesses, schools, health care organizations, and government and nonprofit organizations about the practices of national role models; and recognizes those role models with the Baldrige Award.

The complete list of the 2015 Leadership 500 Excellence winners is available on the award’s website. For more about the Baldrige Performance Excellence Program, call (301) 975-2036 or email

Media Contact: Michael E. Newman, 301-975-3025

back to top