In This Issue...
Locke, Chu Announce Significant Steps in Smart Grid Development
On May 18, 2009, U.S. Commerce Secretary Gary Locke and U.S. Energy Secretary Steven Chu announced significant progress that will help expedite development of a nationwide “smart” electric power grid.
A Smart Grid would replace the current, outdated system and employ real-time, two-way communication technologies to allow users to connect directly with power suppliers. The development of the grid will create jobs and spur the development of innovative products that can be exported. Once implemented, the Smart Grid is expected to save consumers money and reduce America’s dependence on foreign oil by improving efficiency and spurring the use of renewable energy sources.
Before it can be constructed, however, there needs to be agreement on standards for the devices that will connect the grid.
After chairing a meeting of industry leaders at the White House, Locke and Chu announced the first set of standards needed for the interoperability and security of the Smart Grid. They also announced $10 million in Recovery Act funds, provided by the Energy Department to the Commerce Department’s National Institute of Standards and Technology (NIST), to support the development of interoperability standards.
Secretary Chu also announced that, based on feedback from the public and Smart Grid stakeholders, the Department of Energy (DOE) is increasing the maximum award available under the Recovery Act for Smart Grid programs. The maximum award available under the Smart Grid Investment Grant Program will be increased from $20 million to $200 million, and the maximum award for the Smart Grid Demonstration Projects will be increased from $40 million to $100 million. In making awards, DOE will ensure that funding is provided to small projects as well as end-to-end larger projects.
“President Obama has made a smart electrical grid a key element of his plan to lower energy costs for consumers, achieve energy independence and reduce greenhouse gas emissions,” Secretary Locke said. “Today, we took a significant step toward developing the standards necessary to realize the Smart Grid vision.”
“The Smart Grid is an urgent national priority that requires all levels of government as well as industry to cooperate,” Secretary Chu said. “I'm pleased that industry leaders stepped forward today and are working with us to get consensus. We still have much to do, but the ultimate result will be a much more efficient, flexible power grid and the opportunity to dramatically increase our use of renewable energy.”
The initial batch of 16 NIST-recognized interoperability standards will help ensure that software and hardware components from different vendors will work together seamlessly while securing the grid against disruptions.
Spanning areas ranging from smart customer meters to distributed power generation components to cybersecurity, the list of standards is based on the consensus expressed by participants in the first public Smart Grid Interoperability Standards Interim Roadmap workshop, held April 28-29 in Reston, Va.
The DOE also announced that the $10 million it received to support the development of interoperability standards under the American Recovery and Reinvestment Act has been transferred to NIST to help accelerate its efforts to coordinate these critical standards.
Public comments on the initial standards will be accepted for 30 days after their upcoming publication in the Federal Register. The date of publication will be posted on http://www.nist.gov/smartgrid/.
Comments may be submitted to email@example.com.
The Energy Department is the lead federal agency responsible for Smart Grid development. Creating national standards is a critical part of that process. Coordinating these standards and achieving industry buy-in is the responsibility of the Commerce Department. This meeting is part of an aggressive three-phase plan recently launched by the Commerce Department to expedite standards development.
Media Contact: Mark Bello, firstname.lastname@example.org, 301-975-3776
Congress Approves NIST's Recovery Plan
The National Institute of Standards and Technology (NIST) has received Congressional approval on its spending plan for the American Recovery and Reinvestment Act. NIST will use its funds to invest in construction projects, grants, scientific equipment and research fellowships.
“NIST’s investments will create jobs, revitalize the economy and advance American innovation,” U.S. Commerce Secretary Gary Locke said. The funds, Secretary Locke added, “will keep our nation at the forefront of cutting-edge science and technology.”
The NIST spending plan covers the $580 million in direct appropriations to NIST under the Recovery Act. An additional $30 million transferred to NIST from other federal agencies will be covered in a subsequent spending plan.
For the $220 million received for its Scientific and Technical Research and Services (STRS) activities, NIST will fund “research, competitive grants, additional research fellowships and advanced research and measurement equipment and supplies.” For instance, NIST will spend $119 million on high-value research and measurement equipment to be purchased through a competitive award process. It has set aside $35 million for competitive research grants for measurement science in NIST priority areas.
Of the $360 million NIST received for Construction of Research Facilities (CRF), $180 million will “address NIST’s backlog of maintenance and renovation and for construction of new facilities and laboratories.” For example, NIST will allocate $68.5 million to complete funding for a precision measurement laboratory at NIST’s site in Boulder, Colo. NIST will use $39 million to carry out safety, capacity, maintenance and major repair projects that will enhance the performance of NIST’s aging facilities.
With the other $180 million provided in the CRF appropriation, NIST will fund a competitive construction grants program to build research science buildings. NIST will award a portion of the grants to unfunded meritorious applications submitted under its fiscal year 2008 competition and a portion to a new 2009 competition.
For more examples of NIST Recovery Act spending, see http://www.nist.gov/public_affairs/releases/arra_050809.html and http://www.nist.gov/public_affairs/newsfromnist_recovery.html.
Media Contact: Ben Stein, email@example.com, 301-975-3097
Graphene Yields Secrets to Its Extraordinary Properties
Applying innovative measurement techniques, researchers from the Georgia Institute of Technology and the National Institute of Standards and Technology (NIST) have directly measured the unusual energy spectrum of graphene, a technologically promising, two-dimensional form of carbon that has tantalized and puzzled scientists since it was discovered in 2004.
Published in the May 15, 2009, issue of Science,* their work adds new detail to help explain the unusual physical phenomena and properties associated with graphene, a single layer of carbon atoms arrayed in a repeating, honeycomb-like arrangement.
Graphene’s exotic behaviors present intriguing prospects for future technologies, including high-speed graphene-based electronics that might replace today’s silicon-based integrated circuits and other devices. Even at room temperature, electrons in graphene are more than 100 times more mobile than in silicon.
Graphene apparently owes this enhanced mobility to the curious fact that its electrons and other carriers of electric charges behave as though they do not have mass. In conventional materials, the speed of electrons is related to their energy, but not in graphene. Although they do not approach the speed of light, the unbound electrons in graphene behave much like photons, massless particles of light that also move at a speed independent of their energy.
This weird massless behavior is associated with other strangeness. When ordinary conductors are put in a strong magnetic field, charge carriers like electrons begin moving in circular orbits that are constrained to discrete, equally spaced energy levels. In graphene these levels are known to be unevenly spaced because of the “massless” electrons.
The Georgia Tech/NIST team tracked these massless electrons in action, using a specialized NIST instrument to zoom in on the graphene layer at a billion times magnification, tracking the electronic states while at the same time applying high magnetic fields. The custom-built, ultra-low-temperature and ultra-high-vacuum scanning tunneling microscope allowed them to sweep an adjustable magnetic field across graphene samples prepared at Georgia Tech, observing and mapping the peculiar non-uniform spacing among discrete energy levels that form when the material is exposed to magnetic fields.
The team developed a high-resolution map of the distribution of energy levels in graphene. In contrast to metals and other conducting materials, where the distance from one energy peak to the next is uniformly equal, this spacing is uneven in graphene.
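The uneven spacing can be sketched numerically. The short Python example below uses the textbook Landau-level formulas for graphene and for an ordinary conductor; the Fermi velocity and magnetic field are typical illustrative values, not figures from the study.

```python
import math

# Physical constants (SI units)
HBAR = 1.054571817e-34      # reduced Planck constant, J*s
E_CHARGE = 1.602176634e-19  # elementary charge, C
M_E = 9.1093837015e-31      # electron mass, kg

# Assumed parameters (typical literature values, not from the article)
V_F = 1.0e6  # Fermi velocity in graphene, m/s
B = 10.0     # applied magnetic field, tesla

def graphene_level(n, b=B):
    """Graphene Landau level: E_n = sgn(n) * v_F * sqrt(2*e*hbar*B*|n|)."""
    sign = (n > 0) - (n < 0)
    return sign * V_F * math.sqrt(2 * E_CHARGE * HBAR * b * abs(n))

def conventional_level(n, b=B):
    """Ordinary conductor: E_n = hbar * omega_c * (n + 1/2), evenly spaced."""
    omega_c = E_CHARGE * b / M_E  # cyclotron frequency
    return HBAR * omega_c * (n + 0.5)

# Spacing between adjacent levels, converted to electron volts
graphene_gaps = [(graphene_level(n + 1) - graphene_level(n)) / E_CHARGE
                 for n in range(4)]
conventional_gaps = [(conventional_level(n + 1) - conventional_level(n)) / E_CHARGE
                     for n in range(4)]

print("graphene gaps (eV):    ", [round(g, 4) for g in graphene_gaps])
print("conventional gaps (eV):", [round(g, 4) for g in conventional_gaps])
```

Because the graphene energies grow as the square root of the level index, successive gaps shrink, while the conventional gaps are all identical.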
The researchers also probed and spatially mapped graphene’s hallmark “zero energy state,” a curious phenomenon where the material has no electrical carriers until a magnetic field is applied.
The measurements also indicated that layers of graphene grown and then heated on a substrate of silicon-carbide behave as individual, isolated, two-dimensional sheets. On the basis of the results, the researchers suggest that graphene layers are uncoupled from adjacent layers because they stack in different rotational orientations. This finding may point the way to manufacturing methods for making large, uniform batches of graphene for new carbon-based electronics. The research was funded in part by the National Science Foundation, the W.M. Keck Foundation and the Semiconductor Research Corporation through the Nanoelectronics Research Initiative INDEX program, which NIST also supports.
* D.L. Miller, K.D. Kubista, G.M. Rutter, M. Ruan, W.A. de Heer, P.N. First and J.A. Stroscio. Observing the quantization of zero mass carriers in graphene. Science. May 15, 2009.
Media Contact: Mark Bello, firstname.lastname@example.org, 301-975-3776
NIST Engineers Discover Fundamental Flaw in Transistor Noise Theory
Chip manufacturers beware: There’s a newfound flaw in our understanding of transistor noise, a phenomenon affecting the electronic on-off switch that makes computer circuits possible. According to the engineers at the National Institute of Standards and Technology (NIST) who discovered the problem, it will soon stand in the way of creating more efficient, lower-powered devices like cell phones and pacemakers unless we solve it.
While exploring transistor behavior, the team found evidence that a widely accepted model explaining errors caused by electronic “noise” in the switches does not fit the facts. A transistor must be made from highly purified materials to function; defects in these materials, like rocks in a stream, can divert the flow of electricity and cause the device to malfunction. This, in turn, makes it appear to fluctuate erratically between “on” and “off” states. For decades, the engineering community has largely accepted a theoretical model that identifies these defects and helps guide designers’ efforts to mitigate them.
Those days are ending, says NIST’s Jason Campbell, who has studied the fluctuations between on-off states in progressively smaller transistors. The theory, known as the elastic tunneling model, predicts that as transistors shrink, the fluctuations should correspondingly increase in frequency.
However, Campbell’s group at NIST has shown that even in nanometer-sized transistors, the fluctuation frequency remains the same. “This implies that the theory explaining the effect must be wrong,” Campbell said. “The model was a good working theory when transistors were large, but our observations clearly indicate that it’s incorrect at the smaller nanoscale regimes where industry is headed.”
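A toy calculation illustrates the scaling the elastic tunneling model predicts. In that model, a carrier tunnels to a defect at some depth in the oxide, with a rate that falls off exponentially with distance; all of the numbers below are illustrative assumptions, not data from Campbell's group.

```python
import math

LAMBDA = 0.1e-9  # assumed tunneling decay length, m (illustrative)
RATE0 = 1e12     # assumed attempt rate, Hz (illustrative)

def tunneling_rate(defect_depth_m):
    """Elastic-tunneling toy model: rate ~ rate0 * exp(-x / lambda)."""
    return RATE0 * math.exp(-defect_depth_m / LAMBDA)

# As the oxide thins and defects sit closer to the channel, the
# predicted fluctuation rate climbs steeply...
for depth_nm in (3.0, 2.0, 1.0):
    rate = tunneling_rate(depth_nm * 1e-9)
    print(f"defect at {depth_nm} nm -> predicted rate {rate:.2e} Hz")

# ...whereas the NIST measurements found roughly constant fluctuation
# frequencies as transistors shrank, contradicting this scaling.
```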
The findings have particular implications for the low-power transistors currently in demand in the latest high-tech consumer technology, such as laptop computers. Low-power transistors are coveted because using them on chips would allow devices to run longer on less power—think cell phones that can run for a week on a single charge or pacemakers that operate for a decade without changing the battery. But Campbell says that the fluctuations his group observed grew even more pronounced as the power decreased. “This is a real bottleneck in our development of transistors for low-power applications,” he says. “We have to understand the problem before we can fix it—and troublingly, we don’t know what’s actually happening.”
Campbell, who credits NIST colleague K.P. Cheung for first noticing the possibility of trouble with the theory, presented* some of the group’s findings at an industry conference on May 19, 2009, in Austin, Texas. Researchers from the University of Maryland College Park and Rutgers University also contributed to the study.
* J.P. Campbell, L.C. Yu, K.P. Cheung, J. Qin, J.S. Suehle, A. Oates, K. Sheng. Large Random Telegraph Noise in Sub-Threshold Operation of Nano-scale nMOSFETs. 2009 IEEE International Conference on Integrated Circuit Design and Technology. Austin, Texas. May 19, 2009; and Random Telegraph Noise in Highly Scaled nMOSFETs. 2009 IEEE International Reliability Physics Symposium, Montreal, Canada, April 29, 2009.
Media Contact: Chad Boutin, email@example.com, 301-975-4261
NIST Study Finds a Decade of High-Payoff, High-Throughput Research
In its first decade of work, a research effort at the National Institute of Standards and Technology (NIST) to develop novel and improved "combinatorial" techniques for polymer research—an effort that became the NIST Combinatorial Methods Center (NCMC)—realized economic benefits of at least $8.55 for every dollar invested by NIST and its industry partners, according to a new economic analysis.
The new study,* conducted for NIST by RTI International, estimates that from inception (1998) through 2007, the investment in the NCMC has yielded a social rate of return of about 161 percent (also known as the "internal rate of return" in corporate finance; a minimum acceptable IRR for a government research project is about 50 percent). RTI’s evaluation also found that the NCMC accelerated industry’s adoption of combinatorial methods by an average of 2.3 years.
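For readers unfamiliar with the metric, the internal rate of return is the discount rate at which a project's net present value equals zero. The sketch below computes it by bisection for a hypothetical stream of cash flows; the numbers are invented for illustration and are not RTI's actual NCMC data.

```python
def npv(rate, cash_flows):
    """Net present value of cash_flows (cash_flows[0] is year 0)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return: the rate where NPV = 0, found by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical cash flows (millions of dollars): an up-front investment
# followed by growing annual benefits -- NOT the actual NCMC figures.
flows = [-1.0, 0.5, 1.0, 2.0, 4.0]
rate = irr(flows)
print(f"internal rate of return: {rate:.1%}")  # prints 100.0% for these flows
```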
RTI surveyed and interviewed polymer scientists at NCMC member institutions, as well as from the much larger body of research universities and chemical companies who benefited from the center’s research and outreach. The study found that the NCMC has had a significant impact on the development and use of combinatorial research methods for polymers and other organic materials, both in the development of novel techniques and data and the diffusion of research results to the larger polymers research community.
Started as a NIST pilot project in 1998 and formally established in 2002, the NCMC was conceived as a community effort—supported by NIST and industry membership fees—to develop methods for discovering and optimizing complex materials such as multicomponent polymer coatings and films, adhesives, personal care products and structural plastics. Indeed, the RTI report lauded the novel "open source" consortium model that NIST developed for the center as a main reason for its success and impact. The program has since branched out into nanostructured materials, organic electronics, and biomaterials.
The big idea behind combinatorial and high-throughput research is to replace traditional, piecemeal approaches to testing new compounds with methods that can synthesize and test large numbers of possible combinations simultaneously and systematically. NIST in particular pioneered the idea of "continuous gradient libraries," polymer test specimens whose properties change gradually and regularly from one side to the other so that the behavior of a huge number of possible mixtures can be evaluated at the same time. (For example, see http://www.nist.gov/public_affairs/techbeat/tb2006_0608.htm#designer "Designer Gradients Speed Surface Science Experiments," NIST Tech Beat, June 8, 2006, and http://www.nist.gov/public_affairs/techbeat/tb2007_0510.htm#wet "Wetter Report: New Approach to Testing Surface Adhesion," NIST Tech Beat, May 10, 2007.)
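The gradient-library idea can be sketched in a few lines of code: one specimen whose composition varies smoothly across its width stands in for many discrete samples. The mapping, dimensions and property function below are hypothetical stand-ins, not NIST's actual methods.

```python
def composition_at(x_mm, width_mm=25.0):
    """Map a position across the specimen to a blend fraction in [0, 1]."""
    return min(max(x_mm / width_mm, 0.0), 1.0)

def measured_property(fraction):
    """Stand-in for a high-throughput measurement at one spot
    (here, an invented property that peaks at a 60/40 blend)."""
    return 1.0 - (fraction - 0.6) ** 2

# Scanning the single gradient specimen at many points effectively
# tests a different formulation at each position.
scan = [(x, measured_property(composition_at(x))) for x in range(0, 26)]
best_x, best_val = max(scan, key=lambda p: p[1])
print(f"optimum near x = {best_x} mm "
      f"(blend fraction {composition_at(best_x):.2f})")
```

One scan of one specimen thus replaces the synthesis and testing of dozens of individual mixtures.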
The NCMC has contributed advances in three major platform infratechnologies for creating combinatorial "libraries" for materials research—gradient thin films, and discrete libraries created by robotic dispensing systems and microfluidics systems—as well as several new high-throughput measurement methods and information technologies to manage the vast amount of data produced by combinatorial analysis. The center's research program is matched with an outreach effort to disseminate research results through open workshops and training programs.
The RTI report, Retrospective Economic Impact Assessment of the NIST Combinatorial Methods Center, is available online at www.nist.gov/director/prog-ofc/report09-1.pdf. The Web home page of the NIST Combinatorial Methods Center is at http://www.nist.gov/msel/polymers/combi.cfm.
* A.C. O’Connor, H.J. Walls, D.W. Wood and A.N. Link. Retrospective Economic Impact Assessment of the NIST Combinatorial Methods Center. Planning Report 09-1 prepared by RTI International, Research Triangle Park, N. C. for the National Institute of Standards and Technology, April 2009.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763
NIST Defining the Expanding World of Cloud Computing
A working definition for cloud computing—an emerging computing model with the potential for significant cost savings and information technology agility—has been released by a team of computer security experts at the National Institute of Standards and Technology (NIST). Since the federal government is considering cloud computing as a component of its new technology infrastructure, it is NIST’s role to evaluate it and then promote its effective and secure use within government and industry by providing technical guidance and developing standards.
The working definition of cloud computing described by NIST is “a pay-per-use model for enabling available, convenient and on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” The draft working definition also describes five key characteristics, three delivery models and four deployment models.
The NIST cloud computing research team is studying cloud architectures, security and deployment strategies for the federal government. But its first task was to collaborate with industry and government to develop a working definition of cloud computing that will serve as a foundation for its research. The term “cloud computing” comes from the field’s standard use of drawing the Internet as a cloud in diagrams.
Security is always a concern with any new computer approach, and this one is no different. According to cloud research team leader Peter Mell, “Cloud computing has both security advantages and disadvantages. The cloud computing model inherently promotes availability of services through its distributed architecture model. However, this same model presents data confidentiality and integrity challenges by pooling hardware resources for use by multiple parties.”
The full working draft definition is available at http://csrc.nist.gov/groups/SNS/cloud-computing/index.html. Comments on the definition can be sent to email@example.com.
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
NIST Helping Improve Speed Measurements for Cars, Bullets
While today's law enforcement officers don't wear utility belts full of crimefighting gadgets like Batman, they do rely on a variety of state-of-the-art technologies to do their jobs efficiently and safely. Two of these devices—down-the-road (DTR) radar used in speed enforcement and the ballistic chronograph, which measures the speed of bullets—soon should be more useful tools thanks to recent research conducted by the Office of Law Enforcement Standards (OLES) at the National Institute of Standards and Technology (NIST).
In a forthcoming paper in the Journal of Research of the National Institute of Standards and Technology,* researchers John Jendzurski and Nicholas Paulter examined the four methods commonly used by law enforcement officers to calibrate DTR radar devices and, for each one, determined the uncertainty it places on the measurement of a moving target. The calibration methods studied were a radar target simulator (an audio frequency source, best used in a test facility, designed to mimic various speeds of a moving object); tuning forks (which provide a range of audio frequencies that simulate different vehicle speeds and are easily used in the field); a calibrated speedometer (where the DTR measurement depends on the accuracy of the test car’s speedometer); and a fifth wheel (which uses the measured speed of a wheel attached to the rear of the test vehicle instead of relying on the car’s speedometer).
Based on the data they obtained, the researchers developed and published uncertainty measurement formulas for each calibration method. These formulas will help DTR radar users clearly understand the impact of proper calibration for making accurate speed measurements.
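The physics underlying these calibrations is the standard Doppler relation for a target moving directly toward the radar. The sketch below uses that textbook relation with an assumed K-band operating frequency; it is an illustration of the principle, not the paper's uncertainty formulas.

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_shift(speed_mps, radar_freq_hz):
    """Doppler shift for a target closing directly on the radar:
    f_d = 2 * v * f0 / c."""
    return 2.0 * speed_mps * radar_freq_hz / C

def speed_from_shift(f_d_hz, radar_freq_hz):
    """Invert the relation to recover target speed from the measured shift."""
    return f_d_hz * C / (2.0 * radar_freq_hz)

# Assumed K-band police radar frequency (a common value, not from the article)
F0 = 24.15e9  # Hz

kmh = 100.0
v = kmh / 3.6  # convert to m/s
f_d = doppler_shift(v, F0)
print(f"{kmh:.0f} km/h target -> Doppler shift of about {f_d:.0f} Hz")

# A tuning fork whose audio frequency is off by u_f hertz shifts the
# indicated speed by u_v = u_f * c / (2 * f0).
u_f = 10.0  # Hz, hypothetical calibration error
u_v = u_f * C / (2 * F0)
print(f"a {u_f:.0f} Hz calibration error -> {u_v * 3.6:.2f} km/h speed error")
```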
In the second OLES publication, a paper in the April 2009 issue of Optical Engineering,** researchers Donald Larson and Nicholas Paulter developed a ballistic chronograph—an instrument used to measure the velocity of a fired bullet—that is 20 times more precise than a typical manufacturer-provided chronograph. The new instrument has an uncertainty of only ± 0.2 meters per second, compared to ± 4 meters per second, for a bullet traveling 400 meters per second. The NIST chronograph may be used as a reference standard to calibrate and/or characterize the performance of chronographs available on the market. Law enforcement agencies and the military use chronographs when testing ballistic-resistant body armor (commonly, but inaccurately, known as “bulletproof vests”) because the armor’s effectiveness is determined by whether bullets fired at specific velocities perforate the protective gear.
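A chronograph's basic measurement is transit time between two sensing screens a known distance apart, so its uncertainty follows from ordinary propagation of the distance and timing uncertainties. The screen spacing and uncertainty figures below are hypothetical, chosen only to illustrate the calculation.

```python
import math

def velocity(distance_m, time_s):
    """A chronograph times a bullet between two screens a known
    distance apart; velocity is simply distance / time."""
    return distance_m / time_s

def velocity_uncertainty(v, distance_m, u_distance_m, time_s, u_time_s):
    """First-order propagation: (u_v/v)^2 = (u_d/d)^2 + (u_t/t)^2."""
    rel = math.hypot(u_distance_m / distance_m, u_time_s / time_s)
    return v * rel

# Hypothetical numbers: a 400 m/s bullet over a 1 m screen spacing,
# with 0.5 mm spacing uncertainty and 1 microsecond timing uncertainty.
d, t = 1.0, 1.0 / 400.0  # 2.5 ms transit time
v = velocity(d, t)
u_v = velocity_uncertainty(v, d, u_distance_m=0.5e-3, time_s=t, u_time_s=1e-6)
print(f"v = {v:.0f} m/s, u_v = {u_v:.2f} m/s")
```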
* J. Jendzurski and N.G. Paulter. Calibration of Speed Enforcement Down-the-Road Radars. Journal of Research of the National Institute of Standards and Technology, Vol. 114, No. 3 (May-June 2009).
** N.G. Paulter and D.R. Larson. Reference Ballistic Chronograph. Optical Engineering, Vol. 48, No. 4 (April 2009).
Media Contact: Michael E. Newman, email@example.com, 301-975-3025
NIST Validation Program Tests Next-Generation Internet Products
The National Institute of Standards and Technology (NIST) is establishing a testing program to assure that the U.S. government purchases new computers and networking products that work properly on the next-generation Internet traffic system while meeting standards for federal government use. Addressing the U.S. government standards profile known as USGv6, the testing program is being launched with a new publication, a preliminary set of tests and an upcoming meeting to discuss its management.
Every device that is connected to the Internet, from a supercomputer to a smart phone, has a unique numerical ID known as an Internet Protocol (IP) address. However, the number of computers and mobile devices with IP addresses is closing in on the roughly 4.3 billion address limit of the prevailing Internet Protocol, known as IP version 4 (IPv4). To meet the challenge, the computer industry is gradually moving to IP version 6 (IPv6), whose address space is effectively inexhaustible. IPv6 can accommodate 2^128, or more than 340 undecillion (340 followed by 36 zeros), addresses. To imagine this, picture each known star in our universe with trillions of IP addresses.
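The arithmetic behind those figures is easy to verify: IPv4 addresses are 32 bits and IPv6 addresses are 128 bits, so the address counts are simply powers of two.

```python
# Address capacities of the two protocol versions
ipv4_addresses = 2 ** 32    # 32-bit addresses: about 4.3 billion
ipv6_addresses = 2 ** 128   # 128-bit addresses: about 3.4 x 10^38

print(f"IPv4: {ipv4_addresses:,} addresses")
print(f"IPv6: {ipv6_addresses:.3e} addresses")

# "340 undecillion" means 340 * 10^36, which 2^128 just exceeds
print(f"undecillions: {ipv6_addresses / 10**36:.2f}")
```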
The USGv6 profile also endeavors to “raise the bar” on the security capabilities of IPv6 devices connected to the Internet.
In its first step to prepare for the move to this new protocol, NIST in 2008 released a standards profile for U.S. government implementation of IPv6, which is now known as USGv6. The document, NIST Special Publication (NIST SP) 500-267, assists federal agencies in procuring USGv6 networking products.
NIST scientists, in concert with industry partners, have been developing testing procedures to assure that computers, routers and other equipment conform to, and are interoperable with, the IPv6 capabilities specified in the profile.
To help create the test infrastructure necessary to support broad USGv6 initiatives, NIST has taken three major steps: Publishing NIST SP 500-273, IPv6 Test Methods: General Description and Validation, a document describing the USGv6 Test Program procedures for validation and accreditation of test methods; developing a preliminary set of abstract test suites, and scheduling a meeting on May 27, 2009, to discuss program implementation issues.
“This testing regime is important,” explains NIST computer scientist Stephen Nightingale, “to ensure that future USGv6 procurements are both sufficiently capable and interoperable.”
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
First Labs Approved for Testing Communications Interoperability
The U.S. Department of Homeland Security’s (DHS) Office for Interoperability and Compatibility (OIC) recently announced that it has recognized the first eight laboratories approved to test the interoperability of emergency communications equipment under the Project 25 Conformity Assessment Program (P25 CAP). P25 CAP, a joint effort of DHS and the National Institute of Standards and Technology (NIST), helps ensure that first responders, public safety officers and military personnel can always talk with each other no matter what communications equipment they are using. P25 CAP protocols were designed by NIST’s Office of Law Enforcement Standards (OLES). For more information and a list of the eight P25 CAP-approved laboratories, go to http://www.safecomprogram.gov/SAFECOM/currentprojects/project25cap/.
Media Contact: Michael Baum, email@example.com, 301-975-2763