
Tech Beat - April 26, 2011

Editor: Michael Baum
Date created: April 26, 2011
Date Modified: April 26, 2011 
Contact: inquiries@nist.gov

Good Eggs: NIST Nanomagnets Offer Food for Thought About Computer Memories

Magnetics researchers at the National Institute of Standards and Technology (NIST) colored lots of eggs recently. Bunnies and children might find the eggs a bit small—in fact, too small to see without a microscope. But these "eggcentric" nanomagnets have another practical use, suggesting strategies for making future low-power computer memories.

Collage of NIST "nano-eggs" — simulated magnetic patterns in NIST’s egg-shaped nanoscale magnets.
Credit: Talbott/NIST

For a study described in a new paper,* NIST researchers used electron-beam lithography to make thousands of nickel-iron magnets, each about 200 nanometers (billionths of a meter) in diameter. Each magnet is ordinarily shaped like an ellipse, a slightly flattened circle. Researchers also made some magnets in three different egg-like shapes with increasingly pointed ends. It's all part of NIST research on nanoscale magnetic materials, devices and measurement methods to support the development of future magnetic data storage systems.

It turns out that even small distortions in magnet shape can lead to significant changes in magnetic properties. Researchers discovered this by probing the magnets with a laser and analyzing what happens to the "spins" of the electrons, a quantum property that's responsible for magnetic orientation. Changes in the spin orientation can propagate through the magnet like waves at different frequencies. The more egg-like the magnet, the more complex the wave patterns and their related frequencies. (Something similar happens when you toss a pebble in an asymmetrically shaped pond.) The shifts are most pronounced at the ends of the magnets.

To confirm localized magnetic effects and "color" the eggs, scientists simulated various magnets using NIST's Object Oriented MicroMagnetic Framework (OOMMF).** (See graphic.) Lighter colors indicate stronger frequency signals.
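In practice, that "coloring" amounts to Fourier-transforming the simulated magnetization at each point on a grid and mapping the spectral power at a given mode frequency to brightness. The following is a minimal post-processing sketch in Python/NumPy, assuming magnetization data already exported from a simulation; the array shapes, sampling interval and 8 GHz mode frequency are illustrative placeholders, not values from the NIST study or the OOMMF file format.

```python
import numpy as np

# Hypothetical input: one magnetization component sampled on a 2-D grid
# over time, shape (n_steps, ny, nx). Random data stands in for a real
# simulation export; all sizes here are illustrative.
n_steps, ny, nx = 2048, 32, 32
dt = 5e-12  # sampling interval in seconds (assumed)
m_z = np.random.default_rng(0).standard_normal((n_steps, ny, nx))

# FFT along the time axis gives the local spin-wave spectrum at each cell.
spectrum = np.fft.rfft(m_z, axis=0)
freqs = np.fft.rfftfreq(n_steps, d=dt)  # frequencies in Hz

# Spectral power at the mode nearest a frequency of interest (8 GHz here);
# in a map of this quantity, lighter colors mean stronger mode amplitude.
mode = np.argmin(np.abs(freqs - 8e9))
power_map = np.abs(spectrum[mode]) ** 2  # shape (ny, nx)

peak = np.unravel_index(power_map.argmax(), power_map.shape)
print(f"mode at {freqs[mode] / 1e9:.2f} GHz, strongest signal at cell {peak}")
```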

The egg effects explain erratic behavior observed in large arrays of nanomagnets, which may be imperfectly shaped by the lithography process. Such distortions can affect switching in magnetic devices. The egg study results may be useful in developing random-access memories (RAM) based on interactions between electron spins and magnetized surfaces. Spin-RAM is one approach to making future memories that could provide high-speed access to data while reducing processor power needs by storing data permanently in ever-smaller devices. Shaping magnets like eggs breaks up a symmetric frequency pattern found in ellipse structures and thus offers an opportunity to customize and control the switching process.

"For example, intentional patterning of egg-like distortions into spinRAM memory elements may facilitate more reliable switching," says NIST physicist Tom Silva, an author of the new paper.

"Also, this study has provided the Easter Bunny with an entirely new market for product development."

* H.T. Nembach, J.M. Shaw, T.J. Silva, W.L. Johnson, S.A. Kim, R.D. McMichael and P. Kabos. Effects of shape distortions and imperfections on mode frequencies and collective linewidths in nanomagnets. Physical Review B 83, 094427, March 28, 2011.
** See http://math.nist.gov/oommf/.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


Travel Hazards: Two Studies Start to Map Pollutant Threats to Turtles

In a pair of studies—one recently published online* and the other soon to be published**—researchers at the Hollings Marine Laboratory (HML), a government-university collaboration in Charleston, S.C., report that persistent organic pollutants (POPs) are consistently showing up in the blood and eggs of loggerhead sea turtles, that the turtles accumulate more of the contaminant chemicals the farther they travel up the Atlantic coast, and that the pollutants may pose a threat to the survival of this endangered species.

Adult male loggerhead turtle fitted with a transmitter for satellite tracking of migratory patterns.
Credit: J. Keller, NIST

POPs are a large group of man-made chemicals that, as their name indicates, persist in the environment. They also spread great distances through air and water, accumulate in human and animal tissues, infiltrate food chains, and may have carcinogenic and neurodevelopmental effects. POPs include banned substances such as DDT and toxaphenes, once used as pesticides; polychlorinated biphenyls (PCBs), once used as insulating fluids; and polybrominated diphenyl ethers (PBDEs), once used as flame retardants. While POPs have been recognized for many years as a health threat to loggerhead turtles (Caretta caretta), there is little scientific data available to help understand the nature and scope of the risk.

Up close and personal with a loggerhead turtle (Caretta caretta) in the Gulf of Mexico's Flower Garden Banks National Marine Sanctuary (about 170 kilometers, or 100 miles, off the Louisiana coast).
Credit: T. Moore, NOAA

"This uncertainty makes it difficult for wildlife conservation managers to make informed decisions about how best to assist the recovery of the loggerhead species," says National Institute of Standards and Technology (NIST) researcher Jennifer Keller. "Our recent studies provide some of the first measurements of POP levels in adult male and nesting female loggerhead turtles at various locations across their migratory range."

In the first study*, HML researchers from NIST and the College of Charleston (C of C), working with the South Carolina Department of Natural Resources (SCDNR), used satellites to track 19 adult male loggerheads that had been captured in 2006 and 2007 by the SCDNR near Port Canaveral, Fla., fitted with transmitters on their backs and then released back into the wild. The animals, whose blood had been drawn at the time of capture and analyzed for POP concentrations, were followed for at least 60 days to learn their travel patterns. Ten turtles travelled north along the Atlantic coast, eventually migrating to ocean shelf waters between South Carolina and New Jersey. The other nine remained in Florida waters.

Blood plasma concentrations for all of the POPs examined were higher in the transient loggerheads, suggesting that they had eaten contaminated prey, such as crabs, at northern latitudes during previous migrations. Additionally, the loggerheads that travelled farthest north had the highest POP concentrations in their systems. "This may be because the turtles' northern feeding grounds are subjected to higher levels of POPs from areas more populated and more industrialized than those in Florida," says C of C researcher Jared Ragland.

In the other HML turtle study**, Keller and researchers from the National Oceanic and Atmospheric Administration (NOAA), Florida Atlantic University and Duke University measured a large suite of POPs in loggerhead egg yolk samples collected from 44 nests in western Florida, eastern Florida and North Carolina. The team found that POP concentrations were lowest in western Florida, at intermediate levels in eastern Florida and highest in North Carolina.

"This, we believe, can be partly explained by the foraging site selections of nesting females," Keller says. "Turtles that nest in western Florida forage in the Gulf of Mexico and the Caribbean Sea where POP contamination is apparently lower than along the Atlantic coast of the United States where the North Carolina nesters forage, whereas the eastern Florida nesting females forage in areas that overlap the two in terms of geography and POP levels."

The HML is a unique partnership of government and academic institutions including NIST, NOAA's National Ocean Service, the SCDNR, the C of C and the Medical University of South Carolina.

* J.M. Ragland, M.D. Arendt, J.R. Kucklick and J.M. Keller. "Persistent organic pollutants in blood plasma of satellite-tracked adult male loggerhead turtles (Caretta caretta)." Environmental Toxicology and Chemistry, Vol. 30, No. 5, May 2011 (published online Apr. 20, 2011).
** J.J. Alava, J.M. Keller, J. Wyneken, L. Crowder, G. Scott and J.R. Kucklick. "Geographical variation of persistent organic pollutants in eggs of threatened loggerhead sea turtles (Caretta caretta) from southeastern USA." Environmental Toxicology and Chemistry (accepted for publication Apr. 12, 2011).

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


Two Graphene Layers May Be Better Than One

Researchers at the National Institute of Standards and Technology (NIST) have shown that the electronic properties of two layers of graphene vary on the nanometer scale. The surprising new results reveal that the difference in the strength of the electric charges between the two layers not only varies across the layers, but actually reverses in sign, creating randomly distributed puddles of alternating positive and negative charges. Reported in Nature Physics,* the new measurements bring graphene a step closer to being used in practical electronic devices.

NIST measurements show that interactions of the graphene layers with the insulating substrate material cause electrons (red, down arrow) and electron holes (blue, up arrow) to collect in "puddles." The differing charge densities create the random pattern of alternating dipoles and electron band gaps that vary across the layers.
Credit: NIST

Graphene, a single layer of carbon atoms, is prized for its remarkable properties, not the least of which is the way it conducts electrons at high speed. However, the lack of what physicists call a band gap—an energetic threshold that makes it possible to turn a transistor on and off—makes graphene ill-suited for digital electronic applications.

Researchers have known that bilayer graphene, consisting of two stacked graphene layers, acts more like a semiconductor when immersed in an electric field.

According to NIST researcher Nikolai Zhitenev, the band gap may also form on its own due to variations in the sheets' electrical potential caused by interactions among the graphene electrons or with the substrate (usually a nonconducting, or insulating, material) that the graphene is placed upon.

NIST fellow Joseph Stroscio says that their measurements indicate that interactions with the disordered insulating substrate material cause pools of electrons and electron holes (basically, the absence of electrons) to form in the graphene layers. Both electron and hole "pools" are deeper on the bottom layer because it is closer to the substrate. This difference in "pool" depths, or charge density, between the layers creates the random pattern of alternating charges and the spatially varying band gap.

Manipulating the purity of the substrate could give researchers a way to finely control graphene's band gap and may eventually lead to the fabrication of graphene-based transistors that can be turned on and off like a semiconductor.

Still, as shown in the group's previous work,** while these substrate interactions open the door to graphene's use as a practical electronic material, they lower the window on speed. Electrons do not move as well through substrate-mounted bilayer graphene; however, this could likely be compensated for by engineering the graphene/substrate interactions.

Stroscio's team plans to explore further the role that substrates may play in the creation and control of band gaps in graphene by using different substrate materials. If the substrate interactions can be reduced far enough, says Stroscio, the exotic quantum properties of bilayer graphene may be harnessed to create a new quantum field effect transistor.

* G. Rutter, S. Jung, N. Klimov, D. Newell, N. Zhitenev and J. Stroscio. Microscopic polarization in bilayer graphene. Nature Physics. Published online April 24, 2011.
** "See the Jan. 19, 2011, Tech Beat article "Real-World Graphene Devices May Have a Bumpy Ride" at www.nist.gov/public_affairs/tech-beat/tb20110119.cfm#graphene.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


NIST Prototypes Framework for Evaluating Sustainability Standards

As manufacturers and other businesses step up efforts to cut waste, reduce energy use and improve the overall sustainability of their products and processes, the number of planet-friendly standards and regulations also is increasing at a rapid clip, creating a sometimes-confusing array of options for “going green.” National Institute of Standards and Technology (NIST) researchers have prototyped a framework to help organizations of all types sort through the welter of choices and evaluate and implement sustainability standards most appropriate for their operations and interests.

The NIST team will unveil their framework for analyzing sustainability standards on May 4 at the 18th CIRP International Conference on Life Cycle Engineering in Braunschweig, Germany.*

Many incentives—some are carrots, others are sticks—motivate businesses to improve their sustainability performance. These range from bottom-line concerns, like cutting costs and reducing scrap, to compliance with regulatory and customer requirements to good corporate citizenship. Whatever the drivers, businesses are boosting their sustainability efforts. In a recent international survey of more than 3,000 business executives and managers, nearly 70 percent said their organizations would increase their investments in sustainability this year.

As they plan and implement their efforts, businesses often implement sustainability standards as best practices. But which ones should they adopt?

“Despite their noble intentions, the ever-growing number of voluntary and regulatory standards related to sustainability makes it difficult to select standards well suited for a particular product line,” explains NIST computer scientist Sudarsan Rachuri, lead author of the paper. “Small and medium-size enterprises, in particular, face challenges in identifying the standards that warrant their time and resources.”

They need to understand and determine what to measure, how to measure, how to report the results, and how to verify and validate the reported data, he explains.

To help answer these questions, the NIST team adapted the so-called Zachman framework, a formal approach developed in the 1980s to define organizational structures and to classify and organize specifications and data accordingly. More recently, the Zachman framework has been used to describe and categorize complex health-care and cyber security standards.

With the NIST-customized framework, stakeholders can view individual sustainability standards from their particular perspective, such as that of a manufacturer, software supplier, regulator or consumer. Complex standards are broken down into six different levels of detail—from the contextual view of a planner down to the actual data to collect and use—and distilled into categories to answer six questions: what, how, when, who, where and why. Results are arrayed in an easy-to-scan, 36-cell matrix.
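As a rough illustration of that structure (a sketch only, not NIST's portal software), the 36-cell matrix can be held in a simple table keyed by stakeholder perspective and interrogative. The perspective labels below follow the generic Zachman convention, and the sample cell entry is invented:

```python
# A minimal sketch of the 36-cell analysis matrix: six levels of detail
# (Zachman's stakeholder perspectives) crossed with six interrogatives.
perspectives = ["planner", "owner", "designer", "builder", "subcontractor", "user"]
interrogatives = ["what", "how", "when", "who", "where", "why"]

# One cell per (perspective, interrogative) pair, empty to start.
matrix = {(p, q): None for p in perspectives for q in interrogatives}

# Fill a cell with a hypothetical finding from analyzing a standard.
matrix[("planner", "why")] = "restrict hazardous substances in electronics"

# Scan the matrix and print only the populated cells.
for p in perspectives:
    for q in interrogatives:
        if matrix[(p, q)] is not None:
            print(f"{p:>13} / {q:<5}: {matrix[(p, q)]}")
```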

NIST is pilot-testing the framework on its new Sustainability Standards Portal, or SSP (www.mel.nist.gov/msid/SSP/). Also a work in progress, the SSP presents and distills information on a wide range of voluntary and regulatory sustainability standards. For many of these standards, stakeholder requirements have been identified and described. The portal contains an example of the results of an analysis of a regulatory standard (the European Union’s Restriction on the Use of Hazardous Substances Directive) using the NIST-customized version of the Zachman framework.

* S. Rachuri, P. Sarkar, A. Narayanan, J.-H. Lee and P. Witherell. Towards a methodology for analyzing sustainability standards using the Zachman framework. 18th CIRP International Conference on Life Cycle Engineering, Braunschweig, May 2-4, 2011. Available at http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907401.

Media Contact: Mark Bello, bello@nist.gov, 301-975-3776


Understanding How Glasses 'Relax' Provides Some Relief for Manufacturers

Researchers at the National Institute of Standards and Technology (NIST) and Wesleyan University have used computer simulations to gain basic insights into a fundamental problem in material science related to glass-forming materials, offering a precise mathematical and physical description* of the way temperature affects the rate of flow in this broad class of materials—a long-standing goal.

Battery acid, plastic containers and windowpanes are among the many glassy materials whose molecular properties the new study quantifies. Application of the findings could help manufacturers improve the design of such materials from the ground up.
Credit: images ©Shutterstock/collage K. Talbott

Manufacturers who design new materials often struggle to understand viscous liquids at a molecular scale. Many substances including polymers and biological materials change upon cooling from a watery state at elevated temperatures to a tar-like consistency at intermediate temperatures, then become a solid "glass" similar to hard candy at lower temperatures. Scientists have long sought a molecular-level description of this theoretically mysterious, yet common, "glass transition" process as an alternative to expensive and time-consuming trial-and-error material discovery methods. Such a description might permit the better design of plastics and containers that could lengthen the shelf life of food and drugs.

A fundamental question is why many materials behave differently when temperature changes. In some "fragile" glass-forming materials, a modest variation in temperature can make the material change from highly fluid to extremely viscous, while in "strong" fluids this change in viscosity is much more gradual. This effect influences how long a manufacturer has to work with a material as it cools. "For decades, material scientists have heavily relied on empirical rules of thumb to characterize these materials," says NIST theoretician Jack Douglas. "But if you want to design a material that does precisely what you want, you need a molecular understanding of the underlying physical processes involved."

According to Douglas, the increasingly viscous nature of glass-forming liquids is related to molecules that move together in long strings around other atoms that are almost frozen in their motion. The growth of these snake-like structures leads to an increase in the viscosity of the liquid: the lower the temperature, the longer the chains, and the more viscous the fluid. The team found that the rate at which these spontaneously organizing snake-like strings grow as the material cools is quantitatively related to the fluid's fragility—confirming intuitive arguments made nearly half a century ago by physicists G. Adam and J.H. Gibbs, but now bolstering them with a firm computational underpinning.
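For reference, the Adam-Gibbs argument is usually written as a relation between the structural relaxation time and the configurational entropy of the liquid. The textbook form below is standard in the glass literature; it is not an equation quoted from the NIST paper:

```latex
% Adam-Gibbs relation (standard textbook form): relaxation slows as the
% configurational entropy S_c(T) shrinks on cooling, i.e., as the
% cooperatively rearranging regions grow.
\tau(T) = \tau_0 \exp\!\left( \frac{C}{T \, S_c(T)} \right)
```

In this picture, longer cooperative strings at lower temperature mean a smaller S_c, and hence the steep, "fragile" growth of relaxation time and viscosity that the simulations quantify.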

Douglas and his collaborator Francis Starr of Wesleyan University achieved a large variation of fluid fragility using a computer model that mimics a polymer fluid containing tiny nanometer-sized particles. Simulating the addition of various amounts of nanoparticles and varying their interactions with the polymers, Starr says, gave the team a sort of "knob to tweak" to reveal how the fluidity changed with temperature and how the motion of the clusters was quantitatively related to changes in the fluid's properties. This tuning of cooperative motion and fragility in glass-forming liquids should be crucial in material design, Douglas says.

* F.W. Starr and J.F. Douglas. Modifying fragility and collective motion in polymer melts with nanoparticles. Physical Review Letters, Vol. 106, 115702, March 18, 2011. DOI: 10.1103/PhysRevLett.106.115702.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


NIST Seeks Improved Recovery of Samples from Biohazard Events

It may not be as catchy a combination as "Miss Scarlet in the parlor with a revolver," but "polyester-rayon wipes in the field followed by saline-surfactant extraction and vortexing" is the most efficient solution to an important biological game of "Clue" deduced by researchers at the National Institute of Standards and Technology (NIST). As reported in a recent paper*, the NIST team studied different methods for collecting, extracting and quantifying microbial spores from indoor surfaces to estimate parameters that should be considered in the development of a standard biological sampling protocol. A precise and reliable recovery system is critical to evaluating the success of cleanup efforts following an accidental biohazard contamination or bioterrorist attack (such as the Bacillus anthracis spore-laden letters sent to Congress and elsewhere in 2001).

This scanning electron micrograph shows spores from the anthrax vaccine strain of Bacillus anthracis. These spores can live for many years, enabling the bacteria to survive in a dormant state. The spores range in size from 0.5 to 2 micrometers (millionths of a meter).
Credit: J.H. Carr, Centers for Disease Control and Prevention

Technique matters. The NIST team found that the optimal method in their experiments was about 10 times more effective at recovering spores than the worst method, and that recovery dramatically improved by simply adding a surfactant to the extraction solution.

"A comprehensive look at the impact of protocol variables affecting the performance of spore recovery—especially when dealing with serious threats such as anthrax—is an important national homeland security priority," says lead researcher Sandra Da Silva. "In this study, we used Bacillus anthracis (Sterne), a vaccine strain, as a surrogate for Bacillus anthracis (Ames), the causative agent of the anthrax disease, in order to learn how to improve the low efficiency of current protocols."

Possible explanations for the poor results, Da Silva says, include using wipes made from materials that do not attract bacteria well or won't let go of collected bacteria, extraction liquids that fail to completely loosen spores from the wipes and get them into solution, or inadequate dissociation methods that leave spores clinging to the walls of the recovery container. To determine how to avoid these obstacles and maximize the quantity and quality of the spores recovered, the NIST researchers studied different combinations of the three experimental factors: the type of wipe material (polyester, cotton or polyester-rayon), the type of extraction solution (deionized water; deionized water with Tween 80 surfactant; phosphate-buffered solution (PBS) or PBS with Tween 80) and the method used to agitate the solution to loosen the spores from the wipe after collection.
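Taken together, those choices make the study a full factorial comparison across three variables. Here is a small sketch of the design space; the factor levels come from the paragraph above (the two agitation methods, vortexing and sonication, are named in the following paragraphs), and the scoring function is only a placeholder, since recovery efficiencies were measured experimentally rather than computed:

```python
from itertools import product

# Factor levels as described in the study.
wipes = ["polyester", "cotton", "polyester-rayon"]
solutions = ["DI water", "DI water + Tween 80", "PBS", "PBS + Tween 80"]
agitation = ["vortexing", "sonication"]

def recovery_efficiency(wipe, solution, method):
    """Placeholder: in the real study this was a measured percentage."""
    return None

# Full factorial design: 3 x 4 x 2 = 24 protocol combinations to compare.
for wipe, solution, method in product(wipes, solutions, agitation):
    print(f"{wipe:>15} | {solution:<20} | {method}: "
          f"{recovery_efficiency(wipe, solution, method)}")
```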

Polyester-rayon proved to be the most efficient material for the spore collection wipes, a result that the researchers suspect is due to the chemistry and structure of the fibers. Extraction solutions with a surfactant worked better at removing spores from the wipes and getting them into solution. This, the researchers believe, is because the detergent-like character of the surfactant loosens the bond between the spores and the wipe surface similar to the way soap loosens dirt from skin. The surfactant also coated the sides of the recovery container, helping keep the spores from sticking there. Finally, the researchers observed that vortexing (mixing by rapid spinning) physically removed more spores than sonication (sound waves applying force).

Of the three protocol variables studied, the most critical to a successful recovery of B. anthracis (Sterne) spores is the type of extraction solution used, Da Silva says.

The findings from this study are being used to develop an ASTM protocol for bacterial surface sampling. Additionally, the team is conducting a similar study with microbes that are more sensitive to environmental conditions, including Gram-positive (Bacillus cereus) and Gram-negative (Escherichia coli and Burkholderia thailandensis) bacteria.

* S.M. Da Silva, J.J. Filliben and J.B. Morrow. Parameters affecting spore recovery from wipes used in biological surface sampling. Applied and Environmental Microbiology, Vol. 77, No. 7, pages 2374-2380, April 2011.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


White House Launches Plan to Create a Trusted 'Identity Ecosystem' for On-Line Commerce

On April 15, the Obama Administration formally launched its National Strategy for Trusted Identities in Cyberspace (NSTIC), a plan to work with the private sector to develop a private market for secure identity credentials for the Internet.

The plan calls for establishment of an "Identity Ecosystem," in which consumers can choose to obtain "trusted" IDs from one or more private or public credential providers. Consumers can then use their credentials to prove their identity when they're carrying out sensitive transactions, like banking, while staying anonymous when they're not.

"By making online transactions more trustworthy and better protecting privacy, we will prevent costly crime, we will give businesses and consumers new confidence, and we will foster growth and untold innovation. That's why this initiative is so important for our economy," President Obama said in making the announcement.

The NSTIC system would work by creating a set of standards for privacy protection and interoperability of on-line credentials based on cryptography and other techniques such as multi-factor authentication. For example, student Jane Smith could get a digital credential from her cell phone provider and another one from her university and use either of them to log into her bank, her e-mail, her social networking site and so on, all without having to remember dozens of passwords. If she uses one of these credentials to log into her Web email, she could use only her pseudonym, "Jane573." If, however, she chose to use the credential to log into her bank, she could prove that she is truly Jane Smith. People and institutions could have more trust online because all participating service providers will have agreed to consistent standards for identification, authentication, security and privacy.
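The Jane Smith example is a case of selective disclosure: one credential yielding either a pseudonym or a verified identity, depending on the transaction. The toy sketch below illustrates only the concept; it is not the NSTIC architecture or any real credential protocol, and every name and value in it is invented.

```python
import hashlib
import hmac

# Invented long-term secret standing in for a credential issued to Jane
# by her provider. A real system would use vetted cryptographic protocols.
credential_secret = b"jane-smith-long-term-credential-secret"

def pseudonym_for(site: str) -> str:
    """Derive a stable, site-specific pseudonym: the same credential yields
    different 'Jane573'-style handles per site and reveals no real name."""
    digest = hmac.new(credential_secret, site.encode(), hashlib.sha256)
    return "Jane" + str(int(digest.hexdigest(), 16) % 1000)

def verified_claim(site: str) -> dict:
    """For high-trust transactions such as banking, release the real name
    together with a keyed tag the issuing provider could vouch for."""
    tag = hmac.new(credential_secret, f"{site}|Jane Smith".encode(),
                   hashlib.sha256).hexdigest()[:16]
    return {"name": "Jane Smith", "site": site, "tag": tag}

print(pseudonym_for("webmail.example"))  # pseudonymous login
print(verified_claim("bank.example"))    # identity-proving login
```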

Fully implemented, such a system would reap multiple benefits for businesses and consumers alike, officials said. Consumers would benefit from the increased security of the system and the reduced threat of fraud and identity theft, since their personal information would be less exposed and on-line transactions more secure. Businesses would benefit from reduced costs in managing and protecting client information, and could focus on service and new product development.

The April 15th White House announcement, "Administration Releases Strategy to Protect Online Consumers and Support Innovation and Fact Sheet on National Strategy for Trusted Identities in Cyberspace," is available at www.whitehouse.gov/the-press-office/2011/04/15/administration-releases-strategy-protect-online-consumers-and-support-in. Additional information on the National Strategy for Trusted Identities in Cyberspace (NSTIC) is available at www.nist.gov/nstic/.

Media Contact: Ben Stein, ben.stein@nist.gov, 301-975-3097


NIST Offering Free Access to Standards for First Responders

The National Institute of Standards and Technology (NIST) Office of Law Enforcement Standards (OLES), in collaboration with the NIST National Center for Standards and Certification Information, has launched an Internet pilot project to measure the U.S. emergency and first responder communities’ need for documentary standards. As part of the study, NIST will offer U.S. first responders free access to documentary standards published by ASTM International, IEEE and the National Fire Protection Association during summer 2011.

Documentary standards can specify product characteristics, establish accepted test methods and procedures, characterize materials, define processes and systems, or specify knowledge, training and competencies for specific tasks. First responders use basic, testing and product standards primarily to determine the fitness and interoperability of their equipment for the work required. Typical subjects include equipment such as body armor, communications systems and biometric ID systems.

This pilot will contribute to a better understanding of who in the federal, state and local first responder communities needs access to standards, what types of standards they use the most, and how OLES can better serve this community with future research and development. OLES chose the three standards-developing organizations because they publish a large number of responder-relevant standards; other documents that responders rely on, such as the Code of Federal Regulations and Department of Defense Military Specifications, are already freely available.

The OLES project features approximately 300 standards, and OLES will analyze information on the agencies that use the documents. This pilot project is restricted to individuals with a “.gov” or “.mil” e-mail address in order to profile which agencies have the greatest demand. OLES will share the data gathered through the pilot with the agencies as well as with the standards-developing organizations.

Since 1971, the law enforcement, fire, emergency medical service and security communities have relied on OLES for guidance on procurement, deployment, applications, operations and training decisions. OLES helps U.S. federal, state and local agencies ensure that the safety and emergency equipment they need and rely on is safe, dependable, effective and based on sound scientific technologies.

OLES develops equipment performance standards, measurement tools, operating procedures and usage guidelines that help public agencies select criteria for their equipment procurement, deployment, operations and training applications.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


Smart Grid Panel Agrees on Standards for Wireless Communication, Meter Upgrades

The governing board of the Smart Grid Interoperability Panel (SGIP) has voted in favor of a new standard and a set of guidelines important for making the long-planned “smart” electricity grid a reality. The two documents address the need for wireless communications among grid-connected devices as well as the ability to upgrade household electricity meters as the Smart Grid evolves.

The documents are products of two of the standards development projects, called “Priority Action Plans,” or PAPs, that the SGIP identified as describing critical needs for realizing an energy-efficient, modern power grid with seamlessly interoperable parts. The SGIP, a group of public and private organizations, was created by the National Institute of Standards and Technology (NIST) to coordinate development of consensus-based Smart Grid standards.

Almost every house has an electricity meter, and the “Meter Upgradeability Standard” (PAP 0) is designed to ensure that the new generation of smart electricity meters does not become obsolete. According to Paul Molitor, Industry Director for Smart Grid at the National Electrical Manufacturers Association, PAP 0 aims to “future-proof” these meters.

“More than 50 million houses across the country will need new meters for the Smart Grid to function, and PAP 0 will ensure that this substantial upfront investment of time and money is protected,” Molitor said. “PAP 0 makes it possible to upgrade any meter as the standards evolve, and to do so remotely.”

The “Guidelines for Assessing Wireless Communications for Smart Grid Applications” (PAP 2) covers standards necessary for wireless communications between all devices connected to the Smart Grid—not just the meters on your house, but the wide range of components in generation plants, substations and transmission systems necessary to keep energy flowing among the myriad points on the grid.

“Technologies like Wi-Fi and Bluetooth were not designed with Smart Grid in mind,” says NIST’s Nada Golmie. “What PAP 2 does is ensure that any technologies that we use—whether off-the-shelf or not—will provide the features the grid needs.”

Golmie says that—to give one example—there can be far less tolerance of delays between transmission and reception or interruption of signals among grid devices than there is among general data communication devices, such as cell phones. PAP 2’s goal is to specify wireless technology performance that is grid-worthy.

“We would like vendors and standard-setting organizations to become aware of the features a grid-worthy technology will have,” she says. “We’re trying to help facilitate a conversation between technology developers and grid operators, to ensure they are all on the same page. It’s hard to do that without hard numbers about how devices must perform, and PAP 2 provides these numbers.”
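As a toy illustration of the kind of requirements check those hard numbers make possible (the 20-millisecond threshold and sample measurements below are invented; PAP 2 supplies the real figures):

```python
# Toy grid-worthiness check: compare measured message latencies against a
# maximum-delay requirement. All numbers here are invented for illustration.
measured_latencies_ms = [4.2, 7.9, 5.1, 31.0, 6.3, 8.8]
max_delay_ms = 20.0

violations = [t for t in measured_latencies_ms if t > max_delay_ms]
print(f"worst-case latency: {max(measured_latencies_ms)} ms; "
      f"{'PASS' if not violations else str(len(violations)) + ' violation(s)'}")
```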

For more details, see the NIST April 19, 2011, release “Smart Grid Panel Agrees on Standards and Guidelines for Wireless Communication, Meter Upgrades” at www.nist.gov/smartgrid/smartgrid-041911.cfm.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


New Manufacturing Innovations Blog Lifts Off

The manufacturing experts at the National Institute of Standards and Technology (NIST) Hollings Manufacturing Extension Partnership are now spreading the word on manufacturing innovation by blog. Launched April 4, the official MEP Blog: Manufacturing Innovations (http://nistmep.blogs.govdelivery.com/) will serve as a focal point for educating U.S. manufacturers, partners and stakeholders on the latest industry trends.

“From analyzing economic data to sharing successes of our clients, we hope the Manufacturing Innovations Blog will become a site that inspires conversations about manufacturing in the U.S.,” says Roger Kilmer, director of NIST MEP.

Innovation is at the core of what MEP does. Manufacturers that accelerate innovation are far more successful than those that do not. By placing innovations developed through research at federal laboratories, educational institutions and corporations directly in the hands of U.S. manufacturers, MEP plays an essential role in sustaining and growing America’s manufacturing base. The program helps manufacturers achieve new sales, leading to higher tax receipts and new, sustainable jobs in the high-paying advanced manufacturing sector.

MEP works with U.S. manufacturers to help them create and retain jobs, increase profits, and save time and money. The nationwide network provides a variety of services, from innovation strategies to process improvements to green manufacturing. MEP also works with partners at the state and federal levels on programs that put manufacturers in position to develop new customers, expand into new markets and create new products.

Check out the blog. Leave a comment. Join the discussion.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


Solar Cell Technology Opportunities: Looking to a Bright, Sunny Future

What are the major technology challenges to future growth in the solar-cell industry? Where are the big-bang-for-the-buck R&D investment opportunities? These and other questions were put to a group of 72 internationally recognized experts in the field at a special workshop in 2010. Their conclusions are summarized in a new National Institute of Standards and Technology (NIST) publication on Photovoltaic Technologies for the 21st Century.

Solar panels.
Courtesy: Shutterstock/Markus Gann

The workshop was led by a steering committee chaired by Roger G. Little, CEO, Spire Corporation, and Robert W. Collins, NEG Endowed Chair of Silicate and Materials Science, University of Toledo, and co-sponsored by NIST.

Photovoltaics—the generation of electric power by direct conversion of sunlight using "solar cells"—is a rapidly growing field. Conservative estimates predict a worldwide annual photovoltaic manufacturing capacity of 200 gigawatts (GW) by 2020. For comparison, the current global generating capacity for commercial nuclear power plants is estimated to be 377 GW.* The United States currently holds 8 percent of this manufacturing market, but there are opportunities to double that or better, particularly through technological advances, according to the workshop report.
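A quick back-of-the-envelope reading of those figures (the 200 GW projection and 8 percent share come from the paragraph above; the arithmetic below is illustrative):

```latex
% U.S. slice of the projected 2020 annual manufacturing capacity:
0.08 \times 200~\mathrm{GW} = 16~\mathrm{GW}, \qquad
\text{doubled share: } 0.16 \times 200~\mathrm{GW} = 32~\mathrm{GW}.
```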

Workshop participants from industry, academia and government discussed the "Priority Challenges" for the four dominant photovoltaic technologies—crystalline silicon-based wafers, amorphous silicon and polycrystalline thin films, III-V multijunctions (a presently expensive but highly efficient technology first used in space applications), and more experimental excitonic and quantum-structure-based technologies—and defined critical milestones on the path to solutions. Challenges range from reaching a better scientific understanding of the devices themselves to developing practical engineering data for determining optimal use of photovoltaics. Key questions include: How can we simultaneously increase the manufacturing yields, quality and performance of photovoltaic products? How can we better predict a solar cell's expected useful life, and what are the connections between the properties of specific components and the performance of a final device? And how can we exploit this understanding to produce cheaper, more reliable and more energy-efficient devices?

Recognizing that it's not all up to the researchers, the workshop also noted several institutional or policy-related issues for solar power, including increased availability of raw materials; better understanding and control of environmental impacts for the entire life-cycle of a photovoltaic installation; regulatory and tax policies that may needlessly hamper the growth of the market; and the need for better consumer information.

The new publication, Foundations for Innovation: Photovoltaic Technologies for the 21st Century, is available online at http://events.energetics.com/NISTGrandChallenges2010/pdfs/Opps_Solar_PV_web.pdf. The 32-page document summarizes for policy makers a considerably more detailed workshop report issued last year, Workshop Report: Grand Challenges for Advanced Photovoltaic Technologies and Measurements**.

* Estimate by the World Nuclear Association.
** Available at http://events.energetics.com/NISTGrandChallenges2010/pdfs/AdvPV_WorkshopReport.pdf. Additional documents from the May 2010, workshop are available at http://events.energetics.com/NISTGrandChallenges2010/downloads.html.

Media Contact: Michael Baum, baum@nist.gov, 301-975-2763


Qualcomm Executive Joins NIST Advisory Group

Roberto Padovani

Patrick Gallagher, director of the National Institute of Standards and Technology (NIST), has named Roberto Padovani of Qualcomm Incorporated to serve on the Visiting Committee on Advanced Technology (VCAT), the agency’s primary private-sector policy advisory group. Padovani—who will serve a three-year term starting on May 1, 2011—brings the body’s membership to 14.

Padovani is executive vice president and chief technology officer for Qualcomm. He joined Qualcomm in 1986.

Padovani holds more than 80 patents on wireless systems. In addition, he received the Innovators in Telecommunications award from the San Diego Telecom Council in 2004; was elected to the National Academy of Engineering in 2006; and was named Executive of the Year in 2006 by the School of Electrical and Computer Engineering at the University of California, San Diego. In 2009 he received the IEEE Eric E. Sumner Award “for pioneering innovations in wireless communications, particularly to the evolution of CDMA [code division multiple access] for wireless broadband data.”

Padovani received a laurea degree from the University of Padova, Italy, and master of science and Ph.D. degrees from the University of Massachusetts, Amherst, all in electrical and computer engineering. He is an IEEE Fellow and an adjunct professor in the Electrical and Computer Engineering Department at the University of California, San Diego.

The VCAT was established by Congress in 1988 to review and make recommendations on NIST’s policies, organization, budget and programs. The VCAT chair is Vinton Cerf, vice president and chief Internet evangelist for Google. VCAT’s vice chair is Alan Taub, vice president for global research and development at General Motors.

Media Contact: Ben Stein, ben.stein@nist.gov, 301-975-3097
