In This Issue...
Bon MOT: Innovative Atom Trap Catches Highly Magnetic Atoms
A research team from the National Institute of Standards and Technology (NIST) and the University of Maryland has succeeded in cooling atoms of a rare-earth element, erbium, to within two millionths of a degree of absolute zero using a novel trapping and laser cooling technique. Their recent report* is a major step towards a capability to capture, cool and manipulate individual atoms of erbium, an element with unique optical properties that promises highly sensitive nanoscale force or magnetic sensors, as well as single-photon sources and amplifiers at telecommunications wavelengths. It also may have applications in quantum computing devices.
The strongly counterintuitive technique of “laser cooling”—using laser light to slow atoms to very low speeds, and hence to temperatures close to absolute zero—has become a platform technology of atomic physics. Laser cooling combined with specially arranged magnetic fields—a so-called magneto-optical trap (MOT)—has enabled the creation of Bose-Einstein condensates, the capture of neutral atoms for quantum computing experiments, and ultra-precise timekeeping and spectroscopy. The technique originally focused on atoms that are only weakly magnetic and have relatively simple energy structures that can be exploited for cooling, but two years ago** a NIST team showed that the far more complex energy structure of erbium, a strongly magnetic element, also could be manipulated for laser cooling.
The typical MOT uses a combination of six tuned laser beams converging on a point that is in a low magnetic field but surrounded by stronger fields. Originally, the lasers were tuned near a strong natural energy oscillation, or resonance, in the atom, a condition that provides efficient cooling but only to moderately low temperatures. In the new work, the research team instead used much gentler forces applied through a very weak resonance to bring erbium atoms to within a few millionths of a degree of absolute zero. Such weak resonances are available only in atoms with complex energy structures, and previously have been used only with a select group of non-magnetic atoms. When a strongly magnetic atom like erbium is used, the combination of strong magnetic forces and weak absorption of laser photons makes a traditional MOT unstable.
To overcome this, the NIST/UM team turned classic MOT principles on their heads. Rather than shifting the laser frequency towards the red end of the spectrum—to affect fast, high-temperature atoms more than slow, cold ones—they shifted the laser towards the blue side to take advantage of the effects of the magnetic field on the highly magnetic erbium. Magnetism holds the atoms stably trapped while the lasers gently push them against the field, all the while extracting energy and cooling them. The delicate balancing act not only cools and traps the elusive erbium atoms, it does so more efficiently. The team’s modified trap design uses only a single laser and can cool erbium atoms to within two millionths of a degree of absolute zero. By contrast, a conventional MOT only brings rubidium atoms to about one ten-thousandth of a degree.
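Why a weak resonance cools further can be seen from the standard Doppler cooling limit, which is proportional to the natural linewidth Γ of the cooling transition (a textbook relation, not a formula quoted from the paper itself):

```latex
T_D = \frac{\hbar\,\Gamma}{2 k_B}
```

A broad transition, with Γ/2π of tens of megahertz, limits cooling to hundreds of microkelvin, while a narrow line with Γ/2π in the kilohertz range brings the limit down to the microkelvin scale—the regime of the result reported here.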
Erbium commonly is used in optical communications components for its convenient magneto-optical properties. The new trapping technique raises the possibility of using erbium and similar lanthanide elements for unique nanoscale magnetic field detectors, atomic resolution metrology, optical computing systems and quantum computing.
* A.J. Berglund, J.L. Hanssen and J.J. McClelland. Narrow-line magneto-optical cooling and trapping of strongly magnetic atoms. Physical Review Letters, V. 100, p. 113002, March 18, 2008.
** See “Laser Trapping of Erbium May Lead to Novel Devices” Tech Beat, April 28, 2006.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763
More Solid than Solid: A Potential Hydrogen-Storage Compound
One of the key engineering challenges to building a clean, efficient, hydrogen-powered car is how to design the fuel tank. Storing enough raw hydrogen for a reasonable driving range would require either impractically high pressures for gaseous hydrogen or extremely low temperatures for liquid hydrogen. In a new paper* researchers at the National Institute of Standards and Technology’s Center for Neutron Research (NCNR) have demonstrated that a novel class of materials could enable a practical hydrogen fuel tank.
A research team from NIST, the University of Maryland and the California Institute of Technology studied metal-organic frameworks (MOFs), one of several classes of materials that can bind and release hydrogen under the right conditions. MOFs have some distinct advantages over competing materials: in principle they could be engineered so that refueling is as easy as pumping gas at a service station is today, and they don’t require the high temperatures (110 to 500 C) some other materials need to release hydrogen.
In particular, the team examined MOF-74, a porous crystalline powder developed at the University of California at Los Angeles. MOF-74 resembles a series of tightly packed straws composed mostly of carbon atoms, with columns of zinc ions running down the inside walls. A gram of the stuff has about the same surface area as two basketball courts.
The researchers used neutron scattering and gas adsorption techniques to determine that at 77 K (-196 C), MOF-74 can adsorb more hydrogen than any unpressurized framework structure studied to date—packing the molecules in more densely than they would be if frozen in a block.
NCNR scientist Craig Brown says that, though his team doesn’t understand exactly what allows the hydrogen to bond in this fashion, they think the zinc center has some interesting properties.
“When we started doing experiments, we realized the metal interaction doesn’t just increase the temperature at which hydrogen can be stored, but it also increases the density above that in solid hydrogen,” Brown says. “This is absolutely the first time this has been encountered without having to use pressure.”
Although the liquid-nitrogen temperature of MOF-74 is not exactly temperate, it’s far easier to reach than the temperature of solid hydrogen (-269 C), and one goal of this research is to achieve energy densities great enough for hydrogen storage to be as economical as gasoline at ambient, and thus less costly, temperatures. MOF-74 is a step forward in terms of understanding energy density, but other factors remain that, once addressed, could further increase the temperature at which the fuel can be stored. Fully understanding the physics of the interaction might allow scientists to develop means for reducing or removing refrigeration or insulation, both of which are costly in terms of fuel economy, fuel production, or both.
The work was funded in part through the Department of Energy's Hydrogen Sorption Center of Excellence.
* Y. Liu, H. Kabbour, C.M. Brown, D.A. Neumann and C.C. Ahn. Increasing the density of adsorbed hydrogen with coordinatively unsaturated metal centers in metal-organic frameworks. Langmuir, ASAP Article 10.1021/la703864a. Published March 27, 2008.
Media Contact: Mark Esser, email@example.com, 301-975-2767
NIST Shows On-card Fingerprint Match Is Secure, Speedy
A fingerprint identification technology for use in Personal Identification Verification (PIV) cards that offers improved protection from identity theft meets the standardized accuracy criteria for federal identification cards, according to researchers at the National Institute of Standards and Technology (NIST).
Under Homeland Security Presidential Directive 12 (HSPD 12), by this fall most federal employees and contractors will be using federally approved PIV cards to “authenticate” their identity when seeking entrance to federal facilities. In 2006 NIST published a standard* for the new credentials that specifies that the cards store a digital representation of key features or “minutiae” of the bearer’s fingerprints for biometric identification.
Under the current standard, a user seeking to enter a biometrically controlled access point would insert his or her PIV smart card into a slot—just like using an ATM card—and place his or her fingers on a fingerprint scanner. Authentication proceeds in two steps: the cardholder enters a personal identification number to allow the fingerprint minutiae to be read from the card, and the card reader matches the stored minutiae against the newly scanned image of the cardholder’s fingerprints.
In recent tests, NIST researchers assessed the accuracy and security of two variations on this model that, if accepted for government use, would offer improved features. The first** allows the biometric data on the card to travel across a secure wireless interface to eliminate the need to insert the card into a reader. The second*** uses an alternative authentication technique called “match-on-card” in which biometric data from the fingerprint scanner is sent to the PIV smart card for matching by a processor chip embedded in the card. The stored minutiae data never leave the card. The advantage of this, as computer scientist Patrick Grother explains, is that “if your card is lost and then found in the street, your fingerprint template cannot be copied.”
The NIST tests addressed two outstanding questions associated with match-on-card technology. The first was whether the smart cards’ electronic “keys” could keep the wireless data transmissions between the fingerprint reader and the cards secure and execute the match operation, all within a time budget of 2.5 seconds. The second was whether the match-on-card operation would produce as few false acceptance and false rejection decisions as traditional match-off-card schemes, where more computational power is available.
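The match-on-card flow described above can be sketched as follows. This is a simplified illustration, not the actual test harness or any vendor’s matching algorithm: the class and function names, the set-based similarity score, and the threshold are all hypothetical. The sketch captures the two properties under test—the stored template never leaves the card, and the whole operation must fit in the 2.5-second budget.

```python
import time

TIME_BUDGET_S = 2.5  # total budget for the secure session plus the match

class PIVCard:
    """Toy model of a match-on-card PIV card: the stored minutiae
    template stays on the card; matching runs on the card's own chip."""
    def __init__(self, stored_template):
        self._stored = stored_template  # private: never transmitted off-card

    def match_on_card(self, live_template, threshold=0.8):
        # Placeholder similarity score (set overlap); real cards run a
        # vendor fingerprint-matching algorithm on the embedded processor.
        overlap = len(self._stored & live_template)
        score = overlap / max(len(self._stored), 1)
        return score >= threshold

def authenticate(card, scanner_minutiae):
    """Send the freshly scanned minutiae TO the card and time the match."""
    start = time.monotonic()
    # In the real protocol a secure (possibly wireless) channel is
    # established first; here we model only the direction of data flow.
    accepted = card.match_on_card(scanner_minutiae)
    elapsed = time.monotonic() - start
    return accepted, elapsed <= TIME_BUDGET_S

# A cardholder whose live scan contains all of the enrolled minutiae:
card = PIVCard(stored_template={(10, 22), (31, 47), (55, 12)})
ok, in_time = authenticate(card, {(10, 22), (31, 47), (55, 12), (70, 5)})
```

Note the design point the quote emphasizes: `authenticate` passes scanner data *to* the card and gets back only an accept/reject decision, so a lost card yields no copyable template.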
The researchers found that 10 cards with a standard 128-byte-long key and seven cards that use a more secure 256-byte key passed the security and timing test over the wireless interface. On the accuracy side, one team met the criteria set by NIST and two others missed narrowly. The computer scientists plan a new round of tests soon to allow wider participation. For copies of the test report and details of the next test round, see the MINEX (Minutiae Interoperability Exchange Test) Phase II Web pages.
NIST has been at the forefront of security and biometric research and standardization for decades. Prior NIST work, in 2005, quantified the speed versus accuracy tradeoffs associated with storing an individual’s fingerprint minutiae rather than the full fingerprint images on PIV cards. These studies were funded by NIST and the Department of Homeland Security’s Science and Technology Directorate.
* Federal Information Processing Standard (FIPS) 201-1, Personal Identity Verification (PIV) of Federal Employees and Contractors. March, 2006.
** D. Cooper, H. Dang, P. Lee, W. MacGregor and K. Mehta. Secure Biometric Match-on-Card Feasibility Report NIST Interagency Report 7452, Nov., 2007.
*** P. Grother, W. Salamon, C. Watson, M. Indovina and P. Flanagan. MINEX II–Performance of Fingerprint Match-on-Card Algorithms, Phase II Report. NIST Interagency Report 7477, Feb. 29, 2008.
Edited on April 2, 2008 to add reference to Nov. 2007 report.
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
Software Tackles Production Line Machine ‘Cyclic Jitters’
Electronic commands passed from machine to machine over data networks increasingly drive today’s precisely timed and sequenced manufacturing production lines. However, timing irregularities in the signals from even one machine—a difference of only a tenth of a second from the expected timing—can wreak havoc on manufacturing processes on the plant floor. The timing glitches, called “cyclic jitters,” can cause real jitters, making production machines jump or shake, damaging products, even shutting down assembly lines. National Institute of Standards and Technology (NIST) engineers have created a software program to help avoid that problem.
The NIST “EtherNet/Industrial Protocol (IP) Performance Test Tool” enables manufacturers to anticipate how certain machines will perform as part of their data communication system. Data from the tool also can provide vendors with information needed to better tune the performance of their equipment.
Individual vendors often define the performance characteristics of network devices in different ways. These documentation differences make it difficult for manufacturers or plant engineers to compare high-speed data transmission characteristics of similar devices. To determine how different performance characteristics relate, they have to make time-consuming searches through vendor manuals or spend hours contacting vendor company engineers. Although standardized tests can indicate how well devices conform to communication specifications, until now manufacturers never could be sure how well the device actually would work under normal or abnormally heavy transmission conditions on the factory floor.
The EtherNet/IP Performance Test Tool collects device information from the user, generates a set of test scripts based on that information, analyzes the performance data and reports the results to the user. The software package provides device transmission data for three different conditions: with no background electronic traffic; with light background traffic; and with more than 240 devices on the network.
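The generate-then-run workflow described above might look something like the sketch below. The function and field names are purely illustrative—this is not the NIST tool’s actual interface—but it shows the pattern: take the user’s device description and expand it into one test script per background-traffic condition.

```python
# Hypothetical sketch of enumerating the three traffic conditions the
# tool tests; names and parameter values are illustrative assumptions.
TRAFFIC_CONDITIONS = [
    {"name": "no_background",    "background_devices": 0},
    {"name": "light_background", "background_devices": 10},
    {"name": "heavy_background", "background_devices": 240},
]

def generate_test_scripts(device_info):
    """Build one test script (here, just a dict) per traffic condition."""
    scripts = []
    for cond in TRAFFIC_CONDITIONS:
        scripts.append({
            "device": device_info["name"],
            "packet_rate_pps": device_info["packet_rate_pps"],
            "condition": cond["name"],
            "background_devices": cond["background_devices"],
        })
    return scripts

# Example: a device description as the user might supply it.
scripts = generate_test_scripts({"name": "drive-A", "packet_rate_pps": 1000})
```

Running each generated script against the device, then comparing the measured transmission statistics across the three conditions, is what lets a plant engineer see how the device degrades under load.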
NIST began working on the project at the urging of U.S. Council for Automotive Research (USCAR)’s Plant Floor Controllers Task Force and developed the program in conjunction with the Open DeviceNet Vendor Association (ODVA) under a Cooperative Research and Development Agreement (CRADA). ODVA, a vendor organization that maintains the DeviceNet and EtherNet/IP standards used extensively by the U.S. automotive industry, plans to begin using the test tool as part of a new performance testing laboratory service later this year.
Media Contact: John Blair, email@example.com, 301-975-4261
New NIST Publication Recommends Best Fits Between Federal ‘Locks’ and ‘Keys’
Federal agencies have begun issuing a secure form of government-wide ID known as the Personal Identity Verification (PIV) card, mandated in 2004 by Homeland Security Presidential Directive 12 (HSPD-12). The National Institute of Standards and Technology (NIST) has released, for public comment, a draft publication (SP 800-116) outlining best-practice guidelines for making the new cards work with the physical access control systems (PACS) that authenticate the cardholders in federal facilities.
The PIV card is intended to work everywhere across the federal government. Conventional PACS, however, are not fully enabled to work with PIV cards and are not interoperable between agencies. In addition, PACS need to verify the cardholder’s identity with an appropriate degree of confidence (denoted SOME, HIGH or VERY HIGH) depending upon the level of security needed at the particular location in the federal facility. Current PACS, however, may not be tailored to work at these graduated levels of “authentication assurance.”
The draft publication explores methods for verifying identity in a simple model describing four zones of increasing security in a facility. The zones are unrestricted (e.g., outside the fence or walls of the facility), controlled (e.g., inside the fence or front door), limited (e.g., past a security checkpoint for employees in a facility) and exclusion (secure areas granted to individuals for specific needs). It specifies increasingly sophisticated authentication mechanisms for these zones, from visual and CHUID authentication (inspection of the features on the front and back of the PIV card and reading a unique number from the card) to biometrics (for example, the use of distinguishing features in fingerprints known as “minutiae” to grant access) and PKI authentication (exchange of cryptographic information that requires the user to enter a PIN).
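The zone model above amounts to a lookup from security zone to minimum authentication mechanism. The mapping below is a simplified illustration of that idea, not the normative table from SP 800-116—the draft’s actual assignments are more nuanced, and the mechanism labels here are shorthand of our own.

```python
# Simplified illustration of the four-zone model described above;
# mechanism names are shorthand, not SP 800-116's normative terms.
ZONE_MECHANISMS = {
    "unrestricted": None,               # outside the fence: no PIV check
    "controlled":   "visual_or_chuid",  # card inspection / CHUID read
    "limited":      "biometric",        # fingerprint minutiae match
    "exclusion":    "pki",              # cryptographic exchange plus PIN
}

def required_mechanism(zone):
    """Return the minimum mechanism for a zone, or raise on unknown zones."""
    if zone not in ZONE_MECHANISMS:
        raise ValueError(f"unknown zone: {zone}")
    return ZONE_MECHANISMS[zone]
```

The point of such a table is graduated assurance: a reader at a checkpoint consults the policy for its zone rather than applying one fixed check everywhere.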
The report takes into account the many different types of federal facilities, from single-agency buildings to multiple-agency campuses. It also explores how PACS can work with temporary ID cards for guest employees or visitors. The report is intended for government officials responsible for implementing HSPD-12. It covers the use of PIV cards with PACS, rather than the process for issuing them.
The draft document, Special Publication 800-116, “A Recommendation for the Use of PIV Credentials in Physical Access Control Systems (PACS),” is available from the NIST Computer Security Resource Center. The public comment period for the document ends May 12, 2008. Comments may be submitted via electronic mail at firstname.lastname@example.org or via regular mail at 100 Bureau Drive (Mail Stop 8930) Gaithersburg, MD 20899-8930.
Media Contact: Ben Stein, email@example.com, 301-975-3097
Michael D. Schroeder to Receive NIST/NSA Security Award
Michael D. Schroeder, a pioneer researcher in computer security, will be presented with the 2008 National Computer Systems Security Award by the National Institute of Standards and Technology (NIST) and the National Security Agency (NSA) on April 8 during the RSA Conference 2008 in San Francisco, Calif.
Established in 1988, the award recognizes individuals for scientific or technological breakthroughs, outstanding leadership, highly distinguished authorship or significant long-term contributions in the computer security field.
Schroeder has been the Assistant Director of Microsoft Research Silicon Valley since it opened in August 2001. He was the co-inventor of the Needham-Schroeder authentication protocol, a method for allowing two parties to mutually verify each other’s identity over an insecure computer network. The protocol has become the basis for industry standards and is widely used today in commercial security products. In addition, Schroeder implemented a hardware version of the “ring of protection” mechanism, which allows different degrees of access to a CPU based on the level of “privilege” or trust assigned to a user. His hardware implementation of the mechanism was used in the widely deployed Intel x86 series of microprocessors. As a professor at MIT starting in the 1960s, Schroeder was involved in the design of an influential early operating system known as Multics, the Multiplexed Information and Computing Service, which ran on a mainframe and operated on a time-shared basis among computer researchers. He later conducted research at the Xerox PARC Computer Science Lab and the Digital/Compaq Systems Research Center.
For more about Schroeder, please visit his Web site. More information about RSA Conference 2008 is available at the conference Web site.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763
New Report Outlines Research Roadmap for Biomanufacturing
On-line process tools, improved sensor calibration, and better analytical and data-analysis methods are among the technological challenges the biomanufacturing industry must meet over the next 10 years to greatly increase manufacturing efficiency for protein drugs while ensuring safety, according to industry, government and academic experts.
The analysis was developed during a special brainstorming session as part of the annual meeting of the International Foundation for Process Analytical Chemistry (IFPAC) held at the end of January. Representatives from the National Institute of Standards and Technology (NIST), the National Institutes for Health (NIH), the U.S. Food and Drug Administration (FDA) and the Massachusetts Institute of Technology (MIT) co-chaired a session to develop a vision for the future of biomanufacturing that attracted representatives from pharmaceutical companies, vendors, academia, research and development labs and other interested parties.
These improved measurement standards and technologies are essential initial steps towards the group’s “blue sky vision” for the future of biomanufacturing. In this vision, bioreactors will mimic the cellular signaling and regulation systems of higher-order living organisms by regulating nutrients and waste, with built-in mechanisms that allow real-time analysis and control of fermentation and monitoring of metabolism and of the effects of variables within the system.
The group detailed some of the challenges facing the industry over the next 10 years and announced their intention to stimulate discussion and cooperation among stakeholders, including the federal government, academia, and research interests, as a way to address those technological and scientific hurdles.
The team also planned future sessions for the rest of the year, including conferences and site visits scheduled for the summer and fall of 2008 and further workshops to set research priorities during the fall of 2008. To read the full report, see IFPAC 2008: A 10 year Vision for Biotechnology Manufacturing Session.
Media Contact: Michael Baum, email@example.com, 301-975-2763
NIST Third Interoperability Week April 28-May 2
For manufacturing engineers and linguists “interoperability” and “ontology” go together like horse and carriage—“you can’t have one without the other.” The matchup will be evident at the Third Annual Interoperability Week hosted by the National Institute of Standards and Technology (NIST) from April 28 to May 2 in Gaithersburg, Md. Researchers in the two fields will respectively tackle emerging standards for transmitting data from machines that use different software programs (interoperability) and formal rules for understanding the meaning of terms in data and their relationships (ontology).
An Ontology Summit (April 28-29) opens the workshop week with a discussion of issues surrounding repositories of ontologies, such as management, quality assurance and reuse.
The spring meeting of the Open Applications Group Inc. (OAGi, April 29-30) focuses on the use of XML-based standards to promote business process interoperability. As part of the meeting, NIST researchers will present their work to perfect validation tools for OAGi standards, including an automatic validator for OAGi naming and design rules.
The Collaborative Expedition workshop (April 30) will explore identity management principally in the fields of homeland security and cyber infrastructure.
At a Sensor Standards Harmonization Working Group (SSHWG, May 1) experts from government, industry and academia will discuss opportunities to harmonize sensor-related standards such as the IEEE Smart Sensor Standard (IEEE 1451) and sensor-related standards from the Open Geospatial Consortium.
Finally, a 2D and 3D Content Representation, Analysis and Retrieval workshop (May 1-2) will allow researchers to present their advances in image processing, image analysis, shape analysis, indexing, data mining, metadata, ontology, interoperability tools, benchmarks and evaluation methodologies.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763