Knowing When Poultry Goes Foul
Mom's trusty nose may be good, but researchers at the National Institute of Standards and Technology (NIST) have gone her one better by designing an instrument that quickly and precisely sniffs trace amounts of chemical compounds that indicate poultry spoilage without damaging the product itself.* The process can detect minute amounts of spoilage compounds and can be used by suppliers during all stages of processing, transport and storage.
Several proactive measures are used in the United States to keep poultry from going bad between the time it leaves the farm and the time it reaches your grocery cart. Antibiotics and other chemical additives are commonly used to keep the product from spoiling, but without invasive and time-consuming tests, it's hard to determine whether the spoilage process has begun.
For several years, detection of volatile organic compounds created when lipids and/or proteins decompose has been used to test for spoilage. The technique developed by NIST research chemists Tom Bruno and Tara Lovestead instead relies on identifying trace amounts of low-volatility compounds that are present early in the decay process but are much more difficult to detect. Analyzing such low-volatility compounds used to require impractically long collection times to gather a sample large enough for testing and identification.
The key to detecting minute levels of the low-volatility compounds produced when chicken spoils is a new method of sampling the "headspace"—the air above a test sample. Bruno devised a technique, called cryoadsorption, that uses a short alumina-coated tube cooled to very low temperatures to promote the adsorption of low-volatility chemicals. (See "Prototype NIST Method Detects and Measures Elusive Hazards," NIST Tech Beat, Sept. 8, 2009, at www.nist.gov/public_affairs/techbeat/tb2009_0908.htm#explosives.) Among other advantages, Bruno's sampling method is robust and flexible in terms of where and how it can be used, an important feature for the food industry.
Bruno and Lovestead separated and identified six potential chemical markers that could be used to indicate poultry spoilage before the meat becomes a health hazard. Those markers were found in the air above spoiled chicken breasts, maintained in their original retail packaging and kept at room temperature for two weeks.
Considering that Americans annually consume an average of nearly 84 pounds of chicken each (per 2008 USDA statistics, the most recent year available), this improved testing method for spoilage could have significant health implications.
* T. Bruno and T. Lovestead. Detection of poultry spoilage markers from headspace analysis with cryoadsorption on a short alumina PLOT column. Food Chemistry, Volume 121, Issue 4, Aug. 15, 2010, pages 1274-1282.
Media Contact: James Burrus, email@example.com, 303-497-4789
NIST Detector Counts Photons With 99 Percent Efficiency
Scientists at the National Institute of Standards and Technology (NIST) have developed* the world's most efficient single photon detector, which is able to count individual particles of light traveling through fiber optic cables with roughly 99 percent efficiency. The team's efforts could bring improvements to secure electronic communication, advanced quantum computation and the measurement of optical power.
Using essentially the same technology that permitted them to achieve 88 percent detection efficiency five years ago,** the team has enhanced its ability to detect photons largely by improving the alignment of the detector and the optical fibers that guide photons into it. The basic principle of the detector is to use a superconductor as an ultra-sensitive thermometer. Each individual photon hitting the detector raises the temperature—and increases electrical resistance—by a minute amount, which the instrument registers as the presence of a photon.
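The detection principle described above can be illustrated with a toy model. The sketch below is purely illustrative and is not NIST's instrument software; the baseline resistance, pulse height, and noise figures are invented for demonstration. It mimics a superconducting sensor whose resistance jumps slightly each time a photon deposits energy, and counts those jumps against a trigger threshold.

```python
# Toy model of transition-edge photon counting: each absorbed photon
# momentarily warms the superconductor, nudging its electrical
# resistance upward; jumps above a threshold are counted as photons.
# All numerical values here are invented for illustration only.
import random

def simulate_resistance_trace(photon_arrivals, n_samples=100,
                              baseline=1.0, pulse_height=0.05,
                              noise=0.005, seed=42):
    """Return a synthetic resistance-vs-time trace (arbitrary units)."""
    rng = random.Random(seed)
    trace = []
    for t in range(n_samples):
        r = baseline + rng.gauss(0, noise)       # thermal/readout noise
        if t in photon_arrivals:
            r += pulse_height                    # photon energy -> resistance rise
        trace.append(r)
    return trace

def count_photons(trace, baseline=1.0, threshold=0.025):
    """Count samples whose resistance jump exceeds the trigger threshold."""
    return sum(1 for r in trace if r - baseline > threshold)

trace = simulate_resistance_trace(photon_arrivals={10, 40, 70})
print(count_photons(trace))  # expect 3: one count per simulated photon pulse
```

Because the threshold sits well above the noise level in this toy model, noise alone rarely triggers a count, echoing Nam's point below about avoiding false positives.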
According to team member Sae Woo Nam, the new design's advantage is that it not only measures lower levels of light than ever before possible, but does so with great accuracy.
“When these detectors indicate they’ve spotted a photon, they’re trustworthy. They don’t give false positives,” says Nam, a physicist with NIST’s Optoelectronics division. “Other types of detectors have really high gain so they can measure a single photon, but their noise levels are such that occasionally a noise glitch is mistakenly identified as a photon. This causes an error in the measurement. Reducing these errors is really important for those who are doing calculations or communications.”
The ability to count individual photons is valuable to designers of certain types of quantum computers as well as scientists engaged in quantum optical experiments, which concern exotic states of light that cannot be described by classical physics. But one of the most promising potential applications of a high-efficiency photon detector is a way to secure long-distance data transmission against unwanted interception. A detector that could recognize that a photon forming part of a transmission was missing would be a substantial defense against information theft.
The team has optimized the detector for 810 nanometers—a near-infrared wavelength—and it retains high efficiency at other wavelengths of interest to the fiber-optic communications and quantum optics communities. Ironically, the detector is so efficient that it outstrips current technology’s ability to determine its precise efficiency.
“We can’t be sure from direct measurement that we’ve achieved 99 percent efficiency because the metrology is not in place to determine how close we are—there’s no well-established technique,” Nam says. “What is great about our latest progress is that we measure nearly the same detection efficiency for every device we build, package and test. It’s the reproducibility that gives us confidence.”
The team is currently working to develop evaluation techniques that can measure up to the detector’s abilities, and Nam says the team’s creation could also help evaluate other light-gathering devices.
“NIST offers a standardized service for measuring the efficiency of photodetectors and optical power meters,” he says. “We’re trying to develop a calibration technique that extends to ultra-low levels of light. It should be valuable for anyone looking at single photons.”
* A.E. Lita, B. Calkins, L.A. Pellouchoud, A.J. Miller and S. Nam. Superconducting transition-edge sensors optimized for high-efficiency photon-number resolving detectors. Presented at the SPIE Symposium on SPIE Defense, Security, and Sensing, Orlando World Center Marriott Resort and Convention Center, Crystal J1 Ballroom, 3 p.m. April 7, 2010.
** See “NIST Photon Detectors Have Record Efficiency” in NIST Tech Beat, June 2, 2005, at www.nist.gov/public_affairs/techbeat/tb2005_0602.htm#photon.
Media Contact: Chad Boutin, firstname.lastname@example.org, 301-975-4261
New MEP Advisory Board White Paper Assesses the Present and Future of American Manufacturing
A new white paper prepared by the Hollings Manufacturing Extension Partnership (MEP) Board discusses the state of domestic manufacturing and the characteristics of good manufacturers, and plots a course to improve the competitiveness of manufacturing in the United States. The MEP is managed by the National Institute of Standards and Technology (NIST). The Manufacturing Extension Partnership Advisory Board (MEPAB) is an external advisory body created to provide guidance on the MEP program from the perspective of industrial extension customers and providers who share a vision of industrial extension with a national scope.
According to Board Chairman Ned Hill, President for Economic Development at Cleveland State University, there have been a number of public policy reviews of U.S. manufacturing, each with a particular point of view, and nearly all advocating a narrowly defined “silver bullet” policy intervention. The MEPAB report finds that there are reasons for concern about the industry’s future, but there are also reasons for optimism.
Resolving the competitive disadvantages that U.S. manufacturers face is similarly nuanced. Although many observers have pointed to innovation as the key characteristic of successful companies, the report finds that innovation alone is not enough. To be meaningful, innovation must result in new products, new production processes, or new management practices. Manufacturers also need to be green, care about their workforce and develop their in-house talent, and find their niche in the global marketplace.
U.S. manufacturers also are at a disadvantage because of the lack of a national manufacturing policy, according to Hill. What policies the U.S. does have were created ad hoc to deal with specific emergency situations, he says. Other reports have placed the burden of developing a national manufacturing strategy solely on the federal government. The MEPAB report suggests that manufacturers, the government and academia should all be involved in developing national manufacturing policies as well as providing a supporting implementation infrastructure for U.S. manufacturing.
Among the Board's suggestions for the possible shape of such a national policy is developing metrics to measure the return on investments in R&D and federal laboratories. The group also recommends rewarding institutions that actively seek out opportunities for translating and transferring the products of their research into commercial technologies.
To download the MEP report, Innovation and Product Development in the 21st Century, go to www.nist.gov/mep/upload/MEP_advisory_report_4_24l.pdf.
Media Contact: Mark Esser, email@example.com, 301-975-8735
FISMA Project Manager Ron Ross Named to 2010 Federal 100 List
The National Institute of Standards and Technology's (NIST) Ron Ross has received the 2010 Federal 100 Award. Presented by Federal Computer Week, the award honors the top professionals in the federal information technology community. This is the third time he has been named to the list.
A select panel of government and industry leaders chooses the Federal 100 winners from nominations from the public and private sector.
Ross is a senior computer scientist and information security researcher at NIST. He manages the Federal Information Security Management Act (FISMA) Implementation Project, which includes the development of key security standards and guidelines for the federal government and support contractors. These standards and guidelines have been widely adopted nationally and internationally. Ross is also the principal architect of the NIST Risk Management Framework, which provides a disciplined and structured methodology for integrating the suite of security standards and guidelines into a comprehensive, risk-based, enterprise-wide information security program. One of the most significant characteristics of the framework is that it helps organizations adopt continuous monitoring practices, allowing them to dynamically assess their current security posture and respond quickly and effectively to new threats and vulnerabilities. The framework has fundamentally changed how agencies protect their information and information systems.
In addition to these responsibilities, Ross leads the Joint Task Force Transformation Initiative Working Group, a partnership of NIST, the Department of Defense and the Intelligence Community to develop a unified information security framework for the federal government and its support contractors.
“Ross has provided extraordinary research and technical leadership in the field of information security and the unification of information security concepts and practices in the federal government,” says Matthew Scholl, manager of NIST’s Security Management and Assurance Group.
For more information on NIST Agency Security Standards and Guidelines, see http://csrc.nist.gov/groups/SMA/fisma/index.html; for a listing of other Federal 100 winners, see http://fcw.com/pages/2010-federal-100-list.aspx.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763