In This Issue...
NIST Announces New Center for Materials Research to Advance Manufacturing and Innovation
The National Institute of Standards and Technology (NIST) announced today that it has selected a consortium led by Northwestern University to establish a new NIST-sponsored center of excellence for advanced materials research. The new Center for Hierarchical Materials Design (CHiMaD) will be funded in part by a $25 million award from NIST over five years.
Other members of the CHiMaD consortium include the University of Chicago, the Northwestern-Argonne Institute of Science and Engineering (a partnership between Northwestern and the Department of Energy’s Argonne National Laboratory) and the Computation Institute (a partnership between the University of Chicago and Argonne). The consortium also plans to work closely with QuesTek Innovations, a small-business spin-off of Northwestern; ASM International, a well-known professional society of materials scientists; and Fayetteville State University.
“I’m particularly excited to announce this new alliance between NIST and two prominent research universities to drive innovation in the development of advanced materials,” said Patrick Gallagher, Under Secretary of Commerce for Standards and Technology and NIST Director. “This new Center for Hierarchical Materials Design is a natural fit for NIST, which has a long tradition of serving as a nexus with academia and industry to advance research and innovation for the nation’s benefit.”
“The launch of this new center represents a major milestone in support of the President’s Materials Genome Initiative and our national goal of doubling the pace of discovery and development of novel materials,” said Cyrus Wadia, assistant director for Clean Energy and Materials R&D at the White House Office of Science and Technology Policy. “By integrating the complementary strengths of computation, instrumentation, and creative modeling, this center promises to help keep America at the forefront of the materials revolution and a leader in the economically important domain of advanced manufacturing.”
The new center will focus on developing the next generation of computational tools, databases and experimental techniques to enable “Materials by Design*,” one of the primary goals of the administration’s Materials Genome Initiative (MGI). “Materials by Design” employs physical theory, advanced computer models, vast materials properties databases and complex computations to accelerate the design of a new material with specific properties for a particular application—perhaps an extremely tough, lightweight composite for auto bodies or a biocompatible cell scaffold for medicine. It stands in contrast to the traditional trial-and-error method of materials discovery (think of Thomas Edison and his dogged quest for the best lightbulb filament).
Materials-by-design techniques have the potential to revolutionize the development of new advanced materials, which in turn can create whole industries. It’s estimated that moving a new material from laboratory discovery to first commercial use can take up to 20 years. The MGI aims to halve that.
The new center’s work is expected to encompass both “hard” (inorganic) and “soft” (organic) advanced materials in fields as diverse as self-assembled biomaterials, smart materials for self-assembled circuit designs, organic photovoltaic materials, advanced ceramics and metal alloys.
CHiMaD will focus these techniques on a particularly difficult challenge, the discovery of novel “hierarchical materials.” Hierarchical materials exploit distinct structural details at various scales from the atomic on up to achieve special, enhanced properties. An example in nature of a hierarchical material is bone, a composite of mineral and protein at the molecular level assembled into microscopic fibrils that in turn are assembled into hollow fibers and on up to the highly complex material that is “bone.”
The award to the Northwestern consortium for the Center for Hierarchical Materials Design is for $5 million per year for five years, subject to available funds. NIST may, at its discretion, extend the award for an additional five years after a performance review. The Northwestern-led consortium is contributing approximately $4.65 million more to the center.
For more details:
*“Materials by Design" is a registered trademark of QuesTek Innovations LLC.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763
NIST Tornado Reports Urge New Standards for Saving Lives, Property
Nationally accepted standards for building design and construction, public shelters and emergency communications can significantly reduce deaths and the steep economic costs of property damage caused by tornadoes. That is the key conclusion of a two-year technical investigation by the National Institute of Standards and Technology (NIST) into the impacts of the May 22, 2011, tornado that struck Joplin, Mo.
Recommendations for achieving these standards are featured in a draft report* issued for public comment on Nov. 21, 2013, and are strongly supported by a second NIST report** released today that documents impacts observed following the May 20, 2013, tornado in the Newcastle-Moore area of Oklahoma.
The NIST Joplin study was the first to scientifically assess the impact of a tornado in four major categories: tornado characteristics, building performance, human behavior and emergency communication—and the impact of each on life-safety, the ability to protect people from injury or death. It also is the first to recommend that standards and model codes be developed and adopted for designing buildings to better resist tornadoes.
In the majority of buildings studied in Joplin, the research team found that regardless of construction type, the structures did not adequately protect occupants and that Joplin residents had limited access to underground or tornado-resistant shelters. In addition, multiple factors contributed to a delayed or incomplete response by Joplin residents in the tornado's path, including lack of awareness of the tornado's approach, confusion about or distrust of the emergency messages prior to the tornado's arrival, and an inability to perceive risk due to conflicting information.
The Oklahoma study found that the storm in Newcastle and Moore inflicted extensive damage and destruction on buildings and designated safe areas, on par with that seen in Joplin. In both cases, designated safe areas did not adequately protect occupants, and essential facilities (such as hospitals) did not remain operational. The dramatically similar findings, the researchers say, demonstrate the need for nationally accepted standards for tornado-resistant design.
The massive tornado in Joplin was rated by the National Oceanic and Atmospheric Administration (NOAA)'s National Weather Service (NWS) as an EF5, the most powerful category on the Enhanced Fujita scale. The multiple-vortex storm destroyed some 8,000 structures in its path and killed 161 people. It was the single deadliest tornado in the United States in the 63 years that official records have been kept.
The Newcastle-Moore, Okla., storm also was rated an EF5. It damaged or destroyed nearly 2,400 structures and killed 24 people, including seven children who died when the elementary school classroom building in which they took shelter collapsed.
Based on findings from the Joplin investigation, NIST developed 16 recommendations for improving how buildings and shelters are designed, constructed and maintained in tornado-prone regions, and for improving the emergency communications that warn of imminent threat from tornadoes. All of these recommendations are backed by what the team observed in Oklahoma.
The key recommendation proposed in the Joplin report is "the development and adoption of nationally accepted performance-based standards for the tornado-resistant design of buildings and infrastructure to ensure the resiliency of communities to tornado hazards." This includes a call for designing and constructing essential buildings—such as hospitals and emergency operations centers—and infrastructure to remain operational in the event of a tornado.
Following the public comment period on the draft Joplin study, NIST will issue a final report and then work with the appropriate code development organizations to use the study's recommendations to improve model building codes and lay the foundation for nationally accepted standards. NIST also will work with organizations representing state and local governments—including building officials—to encourage them to seriously consider implementing its recommendations.
Comments on the draft Joplin report and recommendations must be received by 5 p.m. Eastern Time on Monday, Jan. 6, 2014. Comments may be submitted via email to email@example.com or mailed to NIST Technical Investigation Joplin, 100 Bureau Dr., Stop 8611, Gaithersburg, Md. 20899-8611.
Full details on the Joplin tornado study and the NIST recommendations are available from the news announcement, "NIST Investigation of Joplin, Mo., Tornado Details Proposed Measures for Saving Lives and Property."
*NIST NCSTAR 3 (Draft for Public Comment), Technical Investigation of the May 22, 2011, Tornado in Joplin, Missouri, and
**NIST SP 1164, Preliminary Reconnaissance of the May 20, 2013, Newcastle-Moore Tornado in Oklahoma, are both available at www.nist.gov/el/disasterstudies.
Media Contact: Michael E. Newman, firstname.lastname@example.org, 301-975-3025
NIST Readies Tests for DARPA Robotics Challenge in December
‘Twill be nearly the night before Christmas, but at Florida’s Homestead-Miami Speedway many a robotic creature will be stirring, while visions of a $2 million prize and international prestige dance in the heads of the machines’ creators.
This Dec. 20-21, research teams from around the world will be competing in the trials of the DARPA Robotics Challenge—a mock-up of a disaster scenario prompted by Japan’s Fukushima Daiichi nuclear meltdown, caused by the 2011 Great East Japan earthquake and tsunami. Teams will be directing their emergency-response robots to perform eight basic tasks that were drawn from the Fukushima Daiichi response and then converted into standardized tests by researchers at the National Institute of Standards and Technology (NIST).
A year later, the capabilities of robots that qualify in this year’s trials will be tested in a more realistic disaster scenario. In the winner-take-all finals, robots will perform all eight challenges consecutively.
The goal of the novel competition, according to the Defense Advanced Research Projects Agency, or DARPA, is to spur “cost-effective” hardware and software innovations that will enable future robots to perform the most hazardous activities during or in the aftermath of a disaster.
Early on, DARPA engaged NIST to help it craft its disaster-response requirements for robots and distill them into tests that the Defense Department agency can use to measure and compare the capabilities of competitors.*
“The DARPA Robotics Challenge is a great learning opportunity for the robotics community and a chance for NIST to demonstrate how standard performance tests help to inspire and guide innovation while measuring progress in a diverse, fast-moving area of technology,” says engineer Adam Jacoff, leader of the NIST testing program.
With support from the Department of Homeland Security, NIST engineers pioneered the use of standardized performance testing for emergency-response robots used in bomb-response and urban search-and-rescue operations. Since 2005, 15 NIST tests have been adopted as standards by ASTM International, and about 40 more are in various stages of development or review.
To date, more than 100 response robots, both experimental and commercial, have run the gauntlet of NIST test methods at Response Robot Evaluation Exercises and in support of robot procurements. Over the last few years, the suite of performance tests has been duplicated at sites around the United States and in Germany, Japan, and soon, Australia.
In the first two tasks during the December 2013 trials, robot contestants will drive a utility vehicle through a slalom course, dismount, and traverse increasingly complex obstacles. Other tasks include removing debris from an entry, opening several doors, climbing a ladder, locating and closing valves, connecting a hose, and using tools to cut a hole through a wall. All tasks consist of three sub-tasks, with points awarded for each completed within a 30-minute time limit.
With collaborator Southwest Research Institute, NIST engineers are finalizing details on all sets of tasks. And instead of wrapping Christmas presents, they soon will be packing up the simulated disaster scenario for shipment to Florida. All tests are designed to be stacked on—of course—standard pallets and arranged for easy reassembly at the competition site.
*A document describing the eight NIST-designed tasks can be accessed from the DARPA Robotics Challenge home page: www.theroboticschallenge.org/.
Media Contact: Mark Bello, email@example.com, 301-975-3776
NIST Calibration Tools to Encourage Use of Novel Medical Imaging Technique
The National Institute of Standards and Technology (NIST) has developed prototype calibration tools for an experimental medical imaging technique that offers new advantages in diagnosing and monitoring certain cancers and possibly other medical conditions.*
NIST designed, constructed and tested two prototype phantoms for calibrating ultralow-field (ULF) magnetic resonance imaging (MRI) systems. Phantoms are widely used tools for quality control in medical imaging. They are generally objects with simple shapes but very well-defined responses to a specific type of imaging scanner. As their name implies, phantoms are stand-ins for the body, and are used to help optimize MRI machines to deliver the best possible medical images for a given type of tissue.
The NIST prototypes are the first standard calibration tools for ULF-MRI, offering a quantitative means to assess performance, validate the technique, and directly compare different experimental and clinical MRI scanners.
"Tissues that may look the same in clinical MRI can look very different in ULF-MRI, which provides new contrast mechanisms," NIST physicist Michael Boss says. "Our hope is that we can move this technique along to attract more interest from [industry] vendors."
MRI noninvasively images soft tissues based on measurements of how hydrogen nuclei—in the water that makes up much of the body—respond to magnetic fields. ULF-MRI enhances tissue contrast in particular types of MRI scans. Prostate tumors, for example, can be difficult to see with conventional MRI but show up clearly under ULF-MRI. ULF-MRI has also been used experimentally to image the brain, and tested in at least one nonmedical application, inspection of liquids at airports.
ULF-MRI also offers practical advantages: The instruments are simpler in design, lighter in weight and less expensive than regular MRI scanners. That's because ULF-MRI operates at much lower magnetic field strengths, measured in microteslas, thousands of times lower than conventional MRI, which operates at up to 3 teslas and requires huge magnets. The low magnetic field strength means ULF-MRI needs the most sensitive magnetometers available: SQUIDs (superconducting quantum interference devices). This is convenient because it makes ULF-MRI suitable for combining with other SQUID-based imaging techniques such as magnetoencephalography.
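The field-strength gap described above can be made concrete with a quick back-of-the-envelope comparison. The 100-microtesla ULF figure below is an illustrative assumption (the article says only "measured in microteslas"), not a number reported by NIST:

```python
# Rough comparison of MRI operating field strengths.
# Assumption: a representative ULF-MRI field of ~100 microtesla;
# actual ULF systems vary. Conventional clinical MRI runs up to 3 T.
conventional_field_tesla = 3.0       # upper end cited for clinical MRI
ulf_field_tesla = 100e-6             # assumed ~100 microtesla ULF field

ratio = conventional_field_tesla / ulf_field_tesla
print(f"Conventional field is roughly {ratio:,.0f} times stronger")
```

Under that assumption the ratio comes out to tens of thousands, consistent with the article's "thousands of times lower" characterization.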
NIST staff previously designed phantoms for conventional MRI systems** and also have extensive experience both making and using SQUIDs. NIST's new ULF-MRI phantoms are short plastic cylinders, shaped like hockey pucks but a bit smaller, containing six or 10 plastic jars filled with various salt solutions that become magnetized in a magnetic field. Each phantom measures a different aspect of scanner performance such as spatial resolution. NIST researchers tested the new phantoms on both a conventional MRI system at the University of Colorado Health Sciences Center (Denver, Colo.) and an experimental ULF-MRI scanner at the University of California (UC) at Berkeley, where the technique was first demonstrated about a decade ago.
Test results show the prototype phantoms are well-matched to ULF-MRI applications and allow direct comparison of ULF and clinical MRI system performance. NIST researchers now plan to incorporate design improvements based on lessons learned from the prototypes, with the aim of improving phantom stability and providing traceability to standard measurement units. NIST and UC Berkeley researchers also plan to work together to further develop ULF-MRI technology for detection of prostate and breast cancers.
NIST's phantoms for conventional MRI systems are currently being tested by hospitals and MRI manufacturers, and Sigma-K Corp. (Durham, N.C.) is developing methods for making copies for more widespread distribution under a NIST SBIR award.***
*M.A. Boss, J.A. Mates, S.E. Busch, Paul SanGiorgio, S.E. Russek, K. Buckenmaier, K.D. Irwin, H.M. Cho, G.C. Hilton and J. Clarke. Prototype phantoms for characterization of ultra-low field magnetic resonance imaging. Magnetic Resonance in Medicine. Paper published online Nov. 26, 2013. DOI: 10.1002/mrm.25060.
**See the 2010 NIST Tech Beat article, "Meet Phannie, NIST's Standard 'Phantom' for Calibrating MRI Machines," at www.nist.gov/pml/electromagnetics/phannie_051110.cfm.
***See the Aug. 31, 2012, NIST announcement, "NIST Announces 12 Small Business Innovation Research Awards" at www.nist.gov/public_affairs/releases/sbir-083112.cfm.
Media Contact: Laura Ost, firstname.lastname@example.org, 303-497-4880
NIST Demonstrates How Losing Information Can Benefit Quantum Computing
Suggesting that quantum computers might benefit from losing some data, physicists at the National Institute of Standards and Technology (NIST) have entangled—linked the quantum properties of—two ions by leaking judiciously chosen information to the environment.
Researchers usually strive to perfectly shield ions (charged atoms) in quantum computing experiments from the outside world. Any "noise" or interference, including heat generated by the experiment and measurements that cause fragile quantum states to collapse, can ruin data and prevent reliable logic operations, the conventional approach to quantum information processing.
Turning a bug into a feature, a collaboration of physicists from NIST and the University of Copenhagen in Denmark decided to think and work outside the box. They cleverly linked the experiment to the outside world to establish and maintain the entanglement of two ions. Entanglement is a curious feature of the quantum world that will be necessary to process and transport quantum data or correct errors in future quantum computers.
The new research is described in a Nature paper posted online Nov. 24,* along with similar work at Yale University using superconducting circuits.
"These new methods might be used to create entangled states that would be a resource in a traditional, logic-based quantum computer," NIST postdoctoral researcher John Gaebler says. "But there are also alternative architectures in which, for example, one couples a quantum computer to a specific noise environment and the resulting state of the computer contains the solution to the target problem."
The NIST experiments used two beryllium ions as quantum bits (qubits) to store quantum information and two partner magnesium ions, which were cooled with three ultraviolet laser beams to release heat.
The qubits were entangled by two ultraviolet laser beams and induced to "leak" any unwanted quantum states to the environment through continuous application of microwaves and one laser beam. The unwanted data were coupled to the outgoing heat in such a way that the qubits were left in only the desired entangled state—which happens to be the point of lowest motional energy, where no further heat or information is released to the environment.
Unlike a logic operation, the process can be started from any state of the ions and still yield the same final state. The scheme also can tolerate some kinds of noise that might cause a traditional logic gate to fail. For instance, the lasers and microwaves had no negative effects on the target entangled state but reshuffled any unwanted states.
All operations applied at the same time quickly drove the two qubits into a specific entangled state and kept them in that state most of the time. The qubits approached the target state within a few milliseconds and were found to be in the correct entangled state 75 percent of the time. The qubit state deteriorated slightly over longer times as the qubits were disturbed by errant laser emissions. By applying about 30 repetitions of the four steps in a particular order, scientists boosted the success rate to 89 percent in a separate experiment.
Co-authors of the paper include two collaborators at QUANTOP, The Niels Bohr Institute, University of Copenhagen. The work was supported in part by the Intelligence Advanced Research Projects Activity, Office of Naval Research, and the European Union's Seventh Framework Program.
*Y. Lin, J.P. Gaebler, F. Reiter, T.R. Tan, R. Bowler, A.S. Sorensen, D. Leibfried and D.J. Wineland. Dissipative production of a maximally entangled steady state. Nature. Posted online Nov. 24, 2013.
Media Contact: Laura Ost, email@example.com, 303-497-4880
MEP Awards Grants to Help Manufacturers "Make it in America"
The Hollings Manufacturing Extension Partnership (MEP) has awarded a total of $3.75 million in Make it in America grants to 10 MEP centers in nine states. The three-year grant awards are in addition to the recently announced $20.5 million in Make it in America funding from the Department of Commerce's Economic Development Administration (EDA), the Department of Labor's Employment and Training Administration (ETA) and the Delta Regional Authority. The overall objective of the Make it in America Challenge is to make it more attractive for businesses to build, continue, or expand their operations in the United States.
This collaboration allowed teams of organizations to submit a single application to receive funding from the four agencies, with each funding stream focused on a different aspect of a regional economic development strategy. The effort's goal is to accelerate job creation by encouraging reshoring by U.S. firms, fostering foreign direct investment, encouraging U.S. companies to keep or expand their businesses in the United States, building strong supply chains and training local workers. EDA's investments will help distressed regions build on existing assets to generate job growth by creating an environment conducive for businesses to establish and grow their operations in the United States. ETA's investments will help to develop a skilled workforce for specific industries. MEP's grants will help develop greater connectivity in regional supply chains and assist small and medium-sized enterprises.
"We're glad to include the expertise of our Manufacturing Extension Partnership Centers in the Make it in America Challenge, specifically to strengthen small manufacturers' capabilities to serve in critical supply chain improvements," said U.S. Secretary of Commerce Penny Pritzker. "These projects will strengthen multiple industrial sectors and support skilled workforces across the country, encouraging investment in the U.S."
The 10 winners of the Make it in America Challenge will pursue projects in nine states. The following MEP centers will each receive $125,000 per year for three years to support their regional Make it in America teams:
For more about the Make it in America Challenge, visit www.commerce.gov/news/fact-sheets/2012/09/25/fact-sheet-make-it-america-challenge.
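As a quick sanity check, the per-center figures quoted above are consistent with the announced total. A trivial arithmetic sketch, using only the numbers from the article:

```python
# Cross-check the MEP grant figures: 10 centers, each receiving
# $125,000 per year for three years, against the announced
# $3.75 million total.
centers = 10
per_center_per_year = 125_000  # dollars
years = 3

total = centers * per_center_per_year * years
print(f"Total MEP funding: ${total:,}")  # $3,750,000
```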
Media Contact: Jennifer Huergo, firstname.lastname@example.org, 301-975-6343
Three NIST IT Staff Named To Top 15 "Forward-Thinking People in Government IT" List
Three employees in the Information Technology Laboratory at the National Institute of Standards and Technology (NIST) were named among the top "forward-thinking people working in government IT" by FierceGovernmentIT, the publication that annually recognizes a group it calls the "Fierce 15."
Patrick Grother, Naomi Lefkovitz and Kevin Stine were selected by the magazine for handling "behind-the-scenes orchestration of some of the most progressive projects underway in government."
As the biometric testing project leader at NIST, Patrick Grother and his team focus on developing, evaluating, testing and providing guidelines for biometrics technologies for identification and authentication. Most recently, Grother and his team led research activities (documented in NIST Special Publication 800-76-2) on the use of biometrics in multifactor authentication, including images of the human iris to serve as a unique identifier, or biometric, on smart card credentials such as those used in federal government identity cards.
Kevin Stine, group leader of NIST's Security Outreach and Integration Group, was recognized for his lead role in orchestrating the "open public review and comment process" that President Obama called for in Executive Order 13636, "Improving Critical Infrastructure Cybersecurity." "From those five words, Kevin Stine crafted a program of aggressive stakeholder engagement on which even critics of the framework lavish praise," wrote the FierceGovernmentIT editors. The final draft of the Cybersecurity Framework is due to the president in February 2014.
Media Contact: Evelyn Brown, email@example.com, 301-975-5661