
Tech Beat - January 26, 2010

Editor: Michael Baum

Proper Vaccine Refrigeration Vital to Putting Disease on Ice

According to a NIST study, vaccines that have not been removed from their packaging (most often a cardboard box) retain their temperature longer than those that have been unpacked and placed in trays. Moreover, the standard-sized refrigerators’ ability to maintain proper temperature was unaffected by how much vaccine the researchers stored in them, a characteristic not shared by the dormitory-style refrigerators.

Credit: NIST

Every year, billions of dollars’ worth of vaccines are shipped to thousands of medical providers across the country, and every year doctors must dispose of tens of millions of dollars’ worth of those vaccines because they became too warm or too cold in storage. Now, researchers at the National Institute of Standards and Technology (NIST), with funding from and in collaboration with the Centers for Disease Control and Prevention (CDC), have completed the first of a series of tests* to determine best practices for properly storing and monitoring the temperature of refrigerated vaccines.

To ensure they remain effective, most vaccines must be kept between 2 and 8 degrees Celsius from the time they are manufactured until they are administered. In addition to the cost of spoiled vaccines that must be destroyed, poor temperature control has probably resulted in ineffective vaccinations being administered to the public in a small but significant percentage of cases.
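
That 2-8 degree window translates directly into a simple check for clinics that keep temperature logs. The short Python sketch below is a minimal illustration only: the threshold values come from the article, but the sample readings, function name and report format are hypothetical.

    # Flag readings outside the 2-8 degrees Celsius vaccine storage window.
    # The window is from the article above; the log data and reporting
    # style are made up for illustration.
    VACCINE_MIN_C = 2.0
    VACCINE_MAX_C = 8.0

    def find_excursions(readings):
        """Return the (timestamp, temperature) pairs outside the safe window."""
        return [(t, c) for t, c in readings
                if not VACCINE_MIN_C <= c <= VACCINE_MAX_C]

    log = [("08:00", 4.6), ("09:00", 5.1), ("10:00", 8.9), ("11:00", 1.7)]
    for when, temp in find_excursions(log):
        print(f"{when}: {temp:.1f} C is outside the 2-8 C window")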

In this first phase of a larger study, NIST researchers compared standard-sized refrigerators without freezers against smaller, dormitory-style refrigerators under a variety of conditions, storage practices and use scenarios, including leaving the refrigerator door ajar for various periods, power loss and raising the ambient temperature of the room.

The NIST Thermometry group found that standard-sized, freezerless refrigerators outperformed the smaller, dormitory-style refrigerators by every measure. The study also identified several good practices for vaccine storage. Among other things, the group determined that vaccines should never be kept on the door shelves, because the relative lack of insulation in the door allowed unacceptable temperature drifts. Vaccines also should be kept away from the walls of the refrigerator, because the defrost cycle can cause the temperature of the walls to shift, and out of the crispers usually found at the very bottom of standard refrigerators, because these areas often dropped below 2 degrees Celsius.

In addition, they found that water bottles kept on the door shelves provide thermal ballast that helps mitigate the temperature rises caused by a power failure, a door left ajar or a rise in the temperature of the room where the refrigerator is kept.

“While we don’t advocate any particular brand of refrigerator, we can say that the standard-sized freezerless refrigerators perform very well, but the dorm-style refrigerators do not and should not be used for storing vaccines,” says NIST physicist Gregory Strouse. “Among the many recommendations that we have made, we think one of the most positive upshots of this research is that medical clinics in most cases need not spend several thousand dollars on a pharmaceutical grade refrigerator simply for the purpose of storing vaccines.”

The NIST group plans further comparisons involving standard-sized refrigerators with freezers and pharmaceutical-grade refrigerators, and it will begin evaluating strategies for transporting vaccines overland. The group also intends to study various styles of temperature sensors for use in shipping.

* M. Chojnacky, W. Miller, D. Ripple and G. Strouse. Thermal Analysis of Refrigeration Systems Used for Vaccine Storage (NISTIR 7656). November 2009.

Edited on Aug. 25, 2010, to delete an inaccurate reference to possible future activities.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


Engineered Metamaterials Enable Remarkably Small Antennas

In an advance that might interest Q-Branch, the gadget makers for James Bond, the National Institute of Standards and Technology (NIST) and partners from industry and academia have designed and tested experimental antennas that are highly efficient and yet a fraction of the size of standard antenna systems with comparable properties. The novel antennas may be useful in ever-shrinking and proliferating wireless systems such as emergency communications devices, micro-sensors and portable ground-penetrating radars to search for tunnels, caverns and other geophysical features.

This Z antenna tested at the National Institute of Standards and Technology is smaller than a standard antenna with comparable properties. Its high efficiency is derived from the “Z element” inside the square that acts as a metamaterial, greatly boosting the signal sent over the air. The square is 30 millimeters on a side.

Credit: C. Holloway/NIST

NIST engineers are working with scientists from the University of Arizona (Tucson) and Boeing Research & Technology (Seattle, Wash.) to design antennas incorporating metamaterials—materials engineered with novel, often microscopic, structures to produce unusual properties. The new antennas radiate as much as 95 percent of an input radio signal and yet defy normal design parameters. Standard antennas need to be at least half the size of the signal wavelength to operate efficiently; at 300 MHz, for instance, an antenna would need to be half a meter long. The experimental antennas are as small as one-fiftieth of a wavelength and could shrink further.
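
The size comparison is easy to reproduce. This short Python sketch works out the numbers quoted above (a half-wavelength antenna at 300 MHz versus one-fiftieth of a wavelength); it is back-of-the-envelope arithmetic, not part of the NIST measurements.

    # Back-of-the-envelope check of the antenna sizes quoted above.
    C = 299_792_458.0                      # speed of light, m/s

    freq_hz = 300e6                        # example frequency from the article
    wavelength = C / freq_hz               # roughly 1 meter at 300 MHz
    half_wave = wavelength / 2             # conventional efficient antenna
    metamaterial = wavelength / 50         # "as small as one-fiftieth"

    print(f"wavelength:           {wavelength * 100:.1f} cm")
    print(f"half-wave antenna:    {half_wave * 100:.1f} cm")
    print(f"metamaterial antenna: {metamaterial * 100:.1f} cm")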

In their latest prototype device,* the research team used a metal wire antenna printed on a small square of copper measuring less than 65 millimeters on a side. The antenna is wired to a signal source. Mounted on the back of the square is a “Z element” that acts as a metamaterial—a Z-shaped strip of copper with an inductor (a device that stores energy magnetically) in the center (see photo).

“The purpose of an antenna is to launch energy into free space,” explains NIST engineer Christopher Holloway. “But the problem with antennas that are very small compared to the wavelength is that most of the signal just gets reflected back to the source. The metamaterial makes the antenna behave as if it were much larger than it really is, because the antenna structure stores energy and re-radiates it.” Conventional antenna designs, Holloway says, achieve a similar effect by adding bulky “matching network” components to boost efficiency, but the metamaterial system can be made much smaller. Even more intriguing, Holloway says, “these metamaterials are much more ‘frequency agile.’ It’s possible we could tune them to work at any frequency we want, on the fly,” to a degree not possible with conventional designs.

The Z antennas were designed at the University of Arizona and fabricated and partially measured at Boeing Research & Technology. The power efficiency measurements were carried out at NIST laboratories in Boulder, Colo. The ongoing research is sponsored by the Defense Advanced Research Projects Agency.

* R.W. Ziolkowski, P. Jin, J.A. Nielsen, M.H. Tanielian and C.L. Holloway. Design and experimental verification of Z antennas at UHF frequencies. IEEE Antennas Wireless Propag. Lett., 2009 vol. 8, pp. 1329-1332.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


Marine Lab Hunts Subtle Clues to Environmental Threats to Blue Crabs

The Atlantic blue crab, Callinectes sapidus, long prized as a savory meal at a summer party or seafood restaurant, is a multimillion-dollar source of income for those who harvest, process and market the crustacean along the U.S. Atlantic and Gulf coasts. Unfortunately, the blue crab population has been declining in recent years under the assault of viruses, bacteria and man-made contaminants. The signs of the attack are often subtle, so researchers from the National Institute of Standards and Technology (NIST) and the College of Charleston (CofC) are working to identify the clues that will finger specific, yet elusive, culprits.

A male Atlantic blue crab, Callinectes sapidus, captured from Maryland’s Chesapeake Bay.

Credit: Mary Hollinger, NOAA, 2008

Pathogens and pollutants impair the blue crab’s metabolic processes, the chemical reactions that produce energy for cells. These stresses should cause tell-tale changes in the levels of metabolites, small chemical compounds created during metabolism. Working at the Hollings Marine Laboratory (HML) in Charleston, S.C., the NIST/CofC research team is using a technology similar to magnetic resonance imaging (MRI) to identify and quantify the metabolites that increase in quantity under common environmental stresses to blue crabs—metabolites that could be used as biomarkers to identify the specific sources.

In a recent paper in Metabolomics,* the HML research team describes how it used nuclear magnetic resonance (NMR) spectroscopy to study challenges to one specific metabolic process in blue crabs: oxygen uptake. First, the researchers simulated an environmentally acquired bacterial infection by injecting crabs with the bacterium Vibrio campbellii, a pathogen that impairs the crab’s ability to incorporate oxygen during metabolism. Using NMR spectroscopy to observe the impact on metabolite levels, the researchers found that glucose, considered a reliable indicator of mild oxygen starvation in crustaceans, was elevated.

In a second experiment, the HML team mimicked a chemical pollutant challenge by injecting blue crabs with a chemical** known to inhibit oxidative phosphorylation, a metabolic process that manufactures energy. This time, the metabolite that rose in response to the stress was lactate, the same compound that builds up when our muscles demand more energy than their oxygen supply can deliver. The rise in lactate showed that the crabs were increasing their oxygen uptake in response to the chemical exposure.

“Having the glucose and lactate biomarkers—and the NMR spectroscopy technique to accurately detect them—is important because the blue crab’s responses to mild, non-lethal metabolic stresses are often so subtle that they can be missed by traditional analyses,” says Dan Bearden, corresponding author on the HML paper.

The research was supported in part by the National Science Foundation.

The HML is a partnership of governmental and academic agencies including NIST, NOAA’s National Ocean Service, the South Carolina Department of Natural Resources, the College of Charleston and the Medical University of South Carolina.

* T.B. Schock, D.A. Stancyk, L. Thibodeaux, K.G. Burnett, L.E. Burnett, A.F.B. Boroujerdi and D.W. Bearden. Metabolomic analysis of Atlantic blue crab, Callinectes sapidus, hemolymph following oxidative stress. Metabolomics, Published online Jan. 20, 2010, DOI 10.1007/s11306-009-0194-y.

** 2,4-dinitrophenol (DNP)

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


Stacking the Deck: Single Photons Observed at Seemingly Faster-than-Light Speeds

(Top) A single photon travels through alternating layers of low (blue) and high (green) refractive index material more slowly (upper stack) or more quickly (lower stack) depending upon the order of the layers; a strategically placed additional layer (lower stack) can dramatically reduce photon transit time. (Bottom) At the boundaries between layers, the photon creates waves that interfere with each other, affecting its transit time.

Credit: JQI

Researchers at the Joint Quantum Institute (JQI), a collaboration of the National Institute of Standards and Technology (NIST) and the University of Maryland at College Park, can speed up photons (particles of light) to seemingly faster-than-light speeds through a stack of materials by adding a single, strategically placed layer. This experimental demonstration confirms intriguing quantum-physics predictions that light’s transit time through complex multilayered materials need not depend on thickness, as it does for simple materials such as glass, but rather on the order in which the layers are stacked. This is the first published study* of this dependence with single photons.

Strictly speaking, light always achieves its maximum speed in a vacuum, or empty space, and slows down appreciably when it travels through a material substance, such as glass or water. The same is true for light traveling through a stack of dielectric materials, which are electrically insulating and can be used to create highly reflective structures that are often used as optical coatings on mirrors or fiber optics.

In a follow-up to earlier experimental measurements (see “A Sub-femtosecond Stop Watch for ‘Photon Finish’ Races,” NIST Tech Beat, March 14, 2008), the JQI researchers created stacks of approximately 30 dielectric layers, each about 80 nanometers thick, equivalent to about a quarter of a wavelength of the light traveling through it. The layers alternated between high (H) and low (L) refractive index materials, which bend or reflect light waves by differing amounts. When a single photon hits the boundary between an H and an L layer, it has a chance of being reflected or passing through.

When encountering a stack of 30 layers alternating between L and H, the rare photons that completely penetrate the stack pass through in about 12.84 femtoseconds (fs, quadrillionths of a second). Adding a single low-index layer to the end of this stack disproportionately increased the photon transit time by 3.52 fs, to about 16.36 fs. (The transit time through this added layer would be only about 0.58 fs if it depended only upon the layer’s thickness and refractive index.) By contrast, adding an extra H layer to a stack of 30 layers alternating between H and L reduced the transit time to about 5.34 fs, so that individual photons seem to emerge through the 2.6-micron-thick stack at superluminal (faster-than-light) speeds.
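
The ordering dependence can be reproduced qualitatively with the standard transfer-matrix description of a layered dielectric. In the Python sketch below, the 30-layer, 80-nanometer geometry follows the article, but the refractive indices (1.45 and 2.10) and the probe wavelength (600 nm) are assumed values chosen for illustration, so the printed delays demonstrate the ordering effect rather than reproduce the measured 12.84, 16.36 and 5.34 fs figures.

    # Transfer-matrix sketch: transit (group) delay of light through a
    # dielectric stack in air, at normal incidence. Layer count and
    # thickness follow the article; indices and wavelength are assumptions.
    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def layer_matrix(n, d, wavelength):
        """Characteristic matrix of a single layer of index n, thickness d."""
        delta = 2 * np.pi * n * d / wavelength
        return np.array([[np.cos(delta), -1j * np.sin(delta) / n],
                         [-1j * n * np.sin(delta), np.cos(delta)]])

    def transmission(stack, wavelength):
        """Complex amplitude transmission of a stack [(n, d), ...] in air."""
        m = np.eye(2, dtype=complex)
        for n, d in stack:
            m = m @ layer_matrix(n, d, wavelength)
        return 2.0 / (m[0, 0] + m[0, 1] + m[1, 0] + m[1, 1])

    def group_delay(stack, wavelength, dwl=1e-11):
        """Transit time as d(phase)/d(omega), by finite difference."""
        t_lo = transmission(stack, wavelength - dwl)
        t_hi = transmission(stack, wavelength + dwl)
        dphi = np.angle(t_hi * np.conj(t_lo))        # small, wrap-safe
        domega = 2 * np.pi * C * (1 / (wavelength + dwl)
                                  - 1 / (wavelength - dwl))
        return dphi / domega

    n_lo, n_hi, d = 1.45, 2.10, 80e-9   # assumed indices; 80 nm layers
    probe = 600e-9                      # assumed probe wavelength

    lh = [(n_lo, d), (n_hi, d)] * 15    # 30 layers, low index first
    hl = [(n_hi, d), (n_lo, d)] * 15    # 30 layers, high index first
    cases = [("L-H x15", lh), ("L-H x15 + L", lh + [(n_lo, d)]),
             ("H-L x15", hl), ("H-L x15 + H", hl + [(n_hi, d)])]
    for name, s in cases:
        print(f"{name:12s} delay = {group_delay(s, probe) * 1e15:6.2f} fs")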

What the JQI researchers are seeing can be explained by the wave properties of light. In this experiment, the light begins and ends its existence acting as a particle—a photon. But when one of these photons hits a boundary between the layers of material, it creates waves at each surface, and the traveling light waves interfere with each other just as opposing ocean waves cause a riptide at the beach. With the H and L layers arranged just right, the interfering light waves combine to give rise to transmitted photons that emerge early. No faster-than-light information transfer occurs because, in actuality, the effect is something of an illusion: only a small proportion of photons make it through the stack, and if all the initial photons were detected, the detectors would record photons over a normal distribution of times.

* N. Borjemscaia, S.V. Polyakov, P.D. Lett and A. Migdall, Single-photon propagation through dielectric bandgaps, Optics Express, published online Jan. 21, 2010, doi:10.1364/OE.18.002279.

Media Contact: Ben Stein, bstein@nist.gov, 301-975-3097


Manufacturing Competition Challenges University Teams to Stack a Better Pallet

The National Institute of Standards and Technology (NIST) is seeking university teams to participate in a May 2010 Virtual Manufacturing Automation Competition (VMAC) to simulate an industrial robot performing a common but complex shop floor task—stacking odd-lot boxes on a shipping pallet. NIST, the engineering society IEEE and Georgia Tech are co-sponsoring the competition. The winning team will run part of its simulated task on a real robot at the International Conference on Robotics and Automation 2010 in Anchorage, Alaska, as part of a demonstration of seamless transition from simulation to real-world systems.

Pallet-Packing Robot: This simulation shows a robot in the middle of creating a pallet of boxes of mixed sizes. NIST engineers are seeking university teams to develop such simulations for a May competition.

Credit: NIST

“To participate, the teams need a computer gaming engine that is available for about $10,” says engineer Steve Balakirsky, adding that “from there they can use existing computer code, or create their own, to develop a simulation of a robot picking up boxes of various sizes and weights from a conveyor belt and arranging them on a pallet for shipping.” NIST’s interest, Balakirsky says, is in devising “performance metrics” that can be used to determine what makes a “good” mixed pallet.

The “mixed palletizing” task is a current manufacturing research interest because plants that produce multiple products often have to ship a variety of items to a single location. The problem is familiar to anyone who has packed a variety of gifts into a larger box for shipping. A sound solution could lead to more efficient delivery of mail, food and other wholesale items.

Building a mixed pallet is an efficient method of transporting goods, but programming a robot for this task requires knowledge of robotics, mobility, mapping and scheduling in a manufacturing environment. Additional considerations in creating the mixed pallet are package density, pallet stability and knowing which packages need special care because of their weight or fragility. The current state of the art in robotic palletizing involves stacking a predetermined number of boxes of the same size and weight. The challenge posed by the VMAC competition should stimulate advances in many areas of robotic algorithms, including perception of the boxes’ geometry, grasping and positioning algorithms, and the overall planning procedures for intelligently configuring mixed pallets from the available boxes.
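
As a toy illustration of the planning side of the problem, the Python sketch below packs boxes onto pallet layers with a simple greedy heuristic: heaviest boxes placed first, each on the lowest layer with remaining floor area. It is a deliberately simplified sketch, not a competition entry; it tracks only footprint area and weight, ignores 3-D geometry, stability and robot motion, and all of the names and values are hypothetical.

    # Toy greedy heuristic for mixed palletizing: sort boxes heaviest-first,
    # then place each one on the lowest layer that still has floor area,
    # so heavy items tend toward the bottom. Real solutions must also
    # handle 3-D packing, stability and fragility.
    PALLET_AREA = 120 * 100              # usable footprint per layer, cm^2

    def palletize(boxes):
        """boxes: (name, footprint_cm2, weight_kg) tuples -> list of layers."""
        layers = []                      # each layer: [used_area, [box names]]
        for name, area, weight in sorted(boxes, key=lambda b: -b[2]):
            for layer in layers:
                if layer[0] + area <= PALLET_AREA:
                    layer[0] += area
                    layer[1].append(name)
                    break
            else:                        # no existing layer fits: start a new one
                layers.append([area, [name]])
        return [names for _, names in layers]

    boxes = [("A", 6000, 25), ("B", 4500, 18), ("C", 4000, 9), ("D", 7000, 30)]
    for level, names in enumerate(palletize(boxes), start=1):
        print(f"layer {level} (bottom-up): {', '.join(names)}")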

This simulation-based competition allows students to engage in real-world research that is ideal for learning robotic architectures, motion planning systems and multi-robot control, Balakirsky said.

The deadline for entering the competition is Feb. 15, and the competition itself will be held May 2-3. More information, including a simple example of a robot simulation, can be found at www.vma-competition.com. Contact NIST researchers at robosim@nist.gov with questions.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


NIST Releases Final Report on Cowboys Facility Collapse

The National Institute of Standards and Technology (NIST) has released its final report on the May 2, 2009, collapse, during a severe thunderstorm, of the fabric-covered, steel-frame practice facility owned by the National Football League’s Dallas Cowboys. The final report is strengthened by clarifications and supplemental text based on comments provided by organizations and individuals in response to the draft report, released for public comment on Oct. 6, 2009. The revisions did not alter the study team’s main finding: the structure collapsed under wind loads significantly less than those required by applicable design standards.

Also left unchanged after the comment period is NIST’s recommendation that other fabric-covered frame structures be evaluated to ensure adequate performance under design wind loads. These evaluations, says NIST, should determine whether the fabric covering provides lateral bracing for structural frames, considering its susceptibility to tearing; whether the building should be considered partially enclosed or fully enclosed, based on the openings that may be present around the building’s perimeter; and whether the failure of one or a few frame members may propagate, leading to a partial or total collapse of the structure.

The Cowboys facility was designed as a series of identical, tubular steel frames with a tensioned fabric covering. Assumptions and approaches used in the design of the building resulted in significant differences between the original calculated wind load demands and structural capacities and those derived by NIST. For instance, NIST calculated internal wind pressure due to the presence of vents and multiple doors based on classifying the building as “partially enclosed” rather than “fully enclosed” as stated in the design documents. Also, NIST did not rely on the building’s fabric to provide lateral bracing (additional perpendicular support) to the frames, in contrast to what was stated in the design documents. Finally, NIST included the effects of localized bending in calculating the expected wind resistance of the structure, whereas the design documents did not indicate that such bending was taken into account.

Based on data acquired during a reconnaissance of the collapsed facility, NIST developed a two-dimensional computer model of a typical structural frame and then analyzed that frame to study its performance under various wind conditions. NIST worked with the National Oceanic and Atmospheric Administration’s (NOAA) National Severe Storms Laboratory to estimate the wind conditions at the time of collapse. The researchers determined that, at the time of collapse, the wind was blowing perpendicular to the long side of the building. Maximum wind speed gusts at the time of collapse were estimated to be in the range of 55 to 65 miles per hour—well below the design wind speed of 90 miles per hour as specified in the national standard for wind loads.
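
The gap between those gusts and the design wind speed is larger than the raw numbers suggest, because wind pressure grows with the square of wind speed. The Python sketch below uses the general velocity-pressure form from U.S. wind load standards (pressure in pounds per square foot, speed in mph), with the exposure and directionality factors folded into a single assumed coefficient of 1.0; it is a back-of-the-envelope illustration, not NIST’s analysis.

    # Why 55-65 mph gusts fall well short of a 90 mph design wind: pressure
    # scales with V squared. The 0.00256*V^2 relation follows the general
    # form of U.S. wind load standards; coeff lumps together the exposure
    # and directionality factors and is set to 1.0 as an assumption.
    def velocity_pressure_psf(v_mph, coeff=1.0):
        return 0.00256 * coeff * v_mph ** 2

    design_q = velocity_pressure_psf(90)
    for gust in (55, 60, 65):
        q = velocity_pressure_psf(gust)
        print(f"{gust} mph gust: {q:5.2f} psf "
              f"({100 * q / design_q:4.1f}% of the 90 mph design pressure)")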

NIST and NOAA analyzed the available wind data and concluded that a microburst (a small, intense downdraft that results in a localized area of strong winds) was centered about one mile southwest of the structure at the time of collapse. The wind field in the vicinity of the structure was predominantly lateral, as assumed in design.

NIST is working with various public and private groups toward implementing changes to practice, standards, and building codes based on the findings from this study.

The complete text of the final report may be accessed at http://www.nist.gov/manuscript-publication-search.cfm?pub_id=904696.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


NIST Issues First Release of Framework for Smart Grid Interoperability

GAITHERSBURG, Md.—The Commerce Department’s National Institute of Standards and Technology (NIST) issued today an initial list of standards, a preliminary cyber security strategy, and other elements of a framework to support transforming the nation’s aging electric power system into an interoperable Smart Grid, a key component of the Obama administration’s energy plan and its strategy for American innovation.

NIST Director Patrick Gallagher announced the publication of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 1.0, to some 700 engineers, scientists, and business and government executives attending the IEEE Innovative Smart Grid Technologies Conference, which NIST is hosting.

The Energy Independence and Security Act of 2007 (EISA) set development of the Smart Grid as a national policy goal, and it assigned NIST the “primary responsibility to coordinate development of a framework that includes protocols and model standards for information management to achieve interoperability of Smart Grid devices and systems …”

“This is an important milestone for NIST, for the entire community of Smart Grid stakeholders, and for the nation,” Gallagher said. “This first installment of the Smart Grid interoperability framework will pay dividends to our nation for decades to come. Just as Congress intended, we are building a foundation for sustainable growth and future prosperity.”

By integrating digital computing and communication technologies and services with the power-delivery infrastructure, the Smart Grid will enable bidirectional flows of energy and two-way communication and control capabilities. A range of new applications and capabilities will result, with anticipated benefits ranging from real-time consumer control over energy usage, to significantly increased reliance on solar and other sources of clean renewable energy, to greatly improved reliability, flexibility and efficiency of the entire grid.

The new report presents the first release of a Smart Grid interoperability framework and roadmap for its further development. It contains:

  • a conceptual reference model to facilitate design of an architecture for the Smart Grid overall and for its networked domains;
  • an initial set of 75 standards identified as applicable to the Smart Grid;
  • priorities for additional standards—revised or new—to resolve important gaps;
  • action plans under which designated standards-setting organizations will address these priorities; and
  • an initial Smart Grid cyber security strategy and associated requirements.


A draft of today’s report was issued on Sept. 24, 2009, for public review and comment. More than 80 individuals and organizations submitted comments. A companion draft document, NISTIR 7628, Smart Grid Cyber Security Strategy and Requirements, also underwent public review. A subsequent draft of the cyber security strategy, which will include responses to comments received and will incorporate new information prepared by the almost 300-member cyber security working group, will be issued in February. NIST intends to finalize the Smart Grid cyber security strategy in late spring.

Under EISA, the Federal Energy Regulatory Commission (FERC) is charged with instituting rulemaking proceedings, and once sufficient consensus is achieved, adopting the standards and protocols necessary to ensure Smart Grid functionality and interoperability in interstate transmission of electric power and in regional and wholesale electricity markets. However, some of the standards listed in the NIST report are still under development and some others, such as those already used voluntarily by industry, may not warrant adoption by FERC or other regulators.

“NIST is working closely with FERC and state utility regulators so that we can coordinate development of additional technical information on individual standards to support their evaluation and potential use for regulatory purposes,” said George Arnold, NIST’s national coordinator for Smart Grid interoperability.

In November 2009, NIST launched a Smart Grid Interoperability Panel (SGIP) to assist NIST in carrying out its EISA-assigned responsibility, including working with regulatory bodies on evaluating and implementing standards in this and subsequent releases of the NIST interoperability framework.

A public-private partnership, the SGIP is designed to provide “a more permanent process” to support the evolution of the interoperability framework and further development of standards, according to the report. With NIST, the report explains, the panel will “identify and address additional gaps, assess changes in technology and associated requirements for standards, and provide ongoing coordination” of standards organizations’ efforts to support timely availability of needed Smart Grid standards.

Over the past two months, almost 500 organizations have joined the SGIP. A total of 1,350 individuals from membership organizations have signed up to participate in the panel’s technical activities.

A copy of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 1.0, can be downloaded here: http://www.nist.gov/public_affairs/releases/smartgrid_interoperability_final.pdf

Comments on the draft report can be found here: http://collaborate.nist.gov/twiki-sggrid/bin/view/SmartGrid/IKBFramework

To learn more about the SGIP, go to: http://collaborate.nist.gov/twiki-sggrid/bin/view/SmartGrid.SGIP

Media Contact: Mark Bello, mark.bello@nist.gov, 301-975-3776


Quicklinks

Baldrige Conference Provides Path for Those on ‘Quest for Excellence’

Learn about the exceptional performance management practices and results of the 2009 recipients of the Malcolm Baldrige National Quality Award at the Quest for Excellence XXII, April 12-14, 2010, at the Hilton Washington in Washington, D.C. The 2009 Baldrige Award recipients —listed with their category—are: Honeywell Federal Manufacturing & Technologies, Kansas City, Mo. (manufacturing); MidwayUSA, Columbia, Mo. (small business); AtlantiCare, Egg Harbor Township, N.J. (health care); Heartland Health, St. Joseph, Mo. (health care); and VA Cooperative Studies Program Clinical Research Pharmacy Coordinating Center, Albuquerque, N.M. (nonprofit).

Throughout the three-day conference, senior leaders and others from the 2009 Baldrige Award recipients and former recipients will present and answer questions on their processes, tools and results in areas such as leadership, strategic planning, and customer and employee engagement. Two pre-conference workshops on April 11 will help attendees better understand how to use the Baldrige criteria for innovation and performance excellence as a tool to assess and improve their organization.

Learn more about the Quest for Excellence®.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


BLOGRIGE: Baldrige Program Blog Now Seeking Contributors!

BLOGRIGE--the new official blog of the Baldrige National Quality Program--is now available online. The blog is designed as a place where readers with a passion for organizational innovation, improvement and performance excellence can share ideas, discuss lessons learned and relate best practices. The blog will feature commentary and observations from Baldrige Program staff, community leaders and Baldrige Award recipient organizations. Users also can sign up to receive automatic e-mail alerts when a new post has been added. Go to http://nistbaldrige.blogs.govdelivery.com/.

Media Contact: Michael Baum, baum@nist.gov, 301-975-2763
