In This Issue...
NIST Micro Sensor and Micro Fridge Make Cool Pair
Researchers at the National Institute of Standards and Technology (NIST) have combined two tiny but powerful NIST inventions on a single microchip, a cryogenic sensor and a microrefrigerator. The combination offers the possibility of cheaper, simpler and faster precision analysis of materials such as semiconductors and stardust.
As described in an upcoming issue of Applied Physics Letters,* the NIST team combined a transition-edge sensor (TES), a superconducting thin film that identifies X-ray signatures far more precisely than any other device, with a solid-state refrigerator based on a sandwich of a normal metal, an insulator and a superconductor. The combo chip, a square about a quarter inch on a side, achieved the first cooling of a fully functional detector (or any useful device) with a microrefrigerator. The paper also reports the greatest temperature reduction in a separate object by microrefrigerators: a temperature drop of 110 millikelvins (mK), or about a tenth of a degree Celsius.
TES sensors are most sensitive at about 100 mK (a tenth of a degree Celsius above absolute zero). However, these ultralow temperatures are usually reached only by bulky, complex refrigerators. Because the NIST chip can provide some of its own cooling, it can be combined easily with a much simpler refrigerator that starts at room temperature and cools down to about 300 mK, says lead scientist Joel Ullom. In this setup, the chip would provide the second stage of cooling from 300 mK down to the operating temperature (100 mK).
One promising application is cheaper, simpler semiconductor defect analysis using X-rays. A small company is already commercializing an earlier version of TES technology for this purpose. In another application, astronomical telescopes are increasingly using TES arrays to take pictures of the early universe at millimeter wavelengths. Use of the NIST chips would lower the temperature and increase the speed at which these images could be made, Ullom says.
For background on how TESs and microrefrigerators work, see “Copper Ridges Nearly Double X-ray Sensor Performance” (Tech Beat, Nov. 17, 2005), and “Chip-scale Refrigerators Cool Bulk Objects” (Tech Beat, April 21, 2005).
The work was supported in part by the National Aeronautics and Space Administration.
* N.A. Miller, G.C. O’Neil, J.A. Beall, G.C. Hilton, K.D. Irwin, D.R. Schmidt, L.R. Vale and J.N. Ullom. High resolution X-ray transition-edge sensor cooled by tunnel junction refrigerators. Forthcoming in Applied Physics Letters.
Media Contact: Laura Ost, email@example.com, 303-497-4880
‘Nanodrop’ Test Tubes Created with a Flip of a Switch
Researchers at the National Institute of Standards and Technology (NIST) have demonstrated a new device that creates nanodroplet “test tubes” for studying individual proteins under conditions that mimic the crowded confines of a living cell. “By confining individual proteins in nanodroplets of water, researchers can directly observe the dynamics and structural changes of these biomolecules,” says physicist Lori Goldner, a coauthor of the paper* published in Langmuir.
Researchers recently have turned their attention to the role that crowding plays in the behavior of proteins and other biomolecules—there is not much extra space in a cell. NIST’s nanodroplets mimic this crowded cellular environment while avoiding a drawback of other techniques for confining or immobilizing proteins, which can interfere with or damage the molecules under study. This more realistic setting can help researchers study the molecular basis of disease and supply information for developing new pharmaceuticals. For example, misfolded proteins play a role in many illnesses, including Type 2 diabetes, Alzheimer’s and Parkinson’s diseases. By seeing how proteins fold in these nanodroplets, researchers may gain new insight into these ailments and may find new therapies.
The NIST nanodroplet delivery system uses fine glass micropipettes to create tiny water droplets suspended in an oily fluid for study under a microscope. An applied pressure forces the water solution containing the protein test subjects to the tip of the micropipette as it sits immersed in a small drop of oil on the microscope stage. Then, like a magician whipping a tablecloth off a table while leaving the dinnerware behind, an electronic switch causes the pipette to jerk back, leaving behind a droplet typically less than a micrometer in diameter.
The droplet is held in place with a laser “optical tweezer,” and another laser is used to excite fluorescence from the molecule or molecules in the droplet. In one set of fluorescence experiments, explains Goldner, “The molecules seem unperturbed by their confinement—they do not stick to the walls or leave the container—important facts to know for doing nanochemistry or single-molecule biophysics.” As in previous work (see “‘Micro-boxes’ of Water Used to Study Single Molecules”, Tech Beat July 20, 2006), the researchers also demonstrated that single fluorescent protein molecules could be detected inside the droplets.
Fluorescence can reveal the number of molecules within the nanodroplet and can show the motion or structural changes of the confined molecule or molecules, allowing researchers to study how two or more proteins interact. By using only a few molecules and tiny amounts of reagents, the technique also minimizes the need for expensive or toxic chemicals.
* J. Tang, A.M. Jofre, G.M. Lowman, R.B. Kishore, J.E. Reiner, K. Helmerson, L.S. Goldner and M.E. Greene. Green fluorescent protein in inertially injected aqueous nanodroplets. Langmuir, ASAP Article, Web release date: March 27, 2008.
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
Prototype Terahertz Imager Promises Biochem Advances
Researchers at the National Institute of Standards and Technology (NIST) have demonstrated a new imaging system that detects naturally occurring terahertz radiation with unprecedented sensitivity and resolution. The technology may become a new tool for chemical and biochemical analyses, with applications ranging from early tumor detection to rapid and precise identification of chemical hazards for homeland security.
Terahertz radiation falls between microwaves and infrared radiation on the electromagnetic spectrum, with frequencies from about 300 billion cycles per second to about 3 trillion cycles per second. Biological and chemical samples naturally emit characteristic signatures of terahertz radiation, but detecting and measuring them is a unique challenge because the signals are weak and absorbed rapidly by the atmosphere. The NIST prototype imager, described in detail for the first time in a new paper,* uses an exquisitely sensitive superconducting detector combined with microelectronics and optics technologies to operate in the terahertz range. The NIST system has its best resolution centered around a frequency of 850 gigahertz, a “transmission window” where terahertz signals can pass through the atmosphere. The system can detect temperature differences smaller than half a degree Celsius, which helps to differentiate between, for example, tumors and healthy tissue.
The heart of the system is a tiny device that measures incoming terahertz radiation by mixing it with a stable internal terahertz signal. This mixing occurs in a thin-film superconductor, which changes temperature upon the arrival of even a minute amount of radiation energy. The slight frequency difference between the two original terahertz signals produces a more easily detected microwave frequency signal.
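The mixing step described above can be sketched numerically: multiplying two sinusoids produces components at the sum and the difference of their frequencies, and only the difference (“beat”) term appears in the low-frequency part of the spectrum. This is a minimal illustration only; the 849 GHz reference frequency below is an assumed value for the sketch, not the instrument’s actual internal signal.

```python
import numpy as np

# Illustrative (assumed) frequencies, not NIST's actual values:
f_signal = 850e9   # incoming terahertz radiation, 850 GHz
f_lo     = 849e9   # stable internal reference signal ("local oscillator")

fs = 10e12                       # simulation sample rate, 10 THz
t = np.arange(0, 20e-9, 1 / fs)  # 20-nanosecond time window
mixed = np.cos(2 * np.pi * f_signal * t) * np.cos(2 * np.pi * f_lo * t)

# The product contains the sum (1699 GHz) and difference (1 GHz) frequencies;
# inspecting only the spectrum below 10 GHz isolates the microwave beat,
# which is far easier to amplify and detect than the original terahertz waves.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
low = freqs < 10e9
beat = freqs[low][np.argmax(spectrum[low])]
print(f"beat frequency: {beat / 1e9:.1f} GHz")  # ~1 GHz, the 850 - 849 difference
```

The slight frequency offset between signal and reference thus reappears as a gigahertz-scale tone carrying the same magnitude and phase information.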
NIST developed the device and antenna, combined with an amplifier on a chip smaller than a penny, in collaboration with the University of Massachusetts. Called a hot electron bolometer (HEB), the technology is sensitive enough to detect the weak terahertz signals naturally emitted by samples, eliminating the need to generate terahertz radiation to actively illuminate the samples. This greatly reduces complexity and minimizes safety concerns. In addition, the NIST “mixer” system delivers more information by detecting both the magnitude and phase (the point where each individual wave begins) of the radiation.
Because passively emitted signals are so weak, the current system takes about 20 minutes to make a single 40 x 40 pixel image. NIST researchers are working on an improved version that will scan faster and operate at two frequencies at once. Future systems also should be able to achieve better spatial resolution.
* E. Gerecht, D. Gu, L. You and S. Yngvesson. Passive heterodyne hot electron bolometer imager operating at 850 GHz. Forthcoming in IEEE Transactions on Microwave Theory and Techniques.
Media Contact: Laura Ost, email@example.com, 303-497-4880
NIST, Army Researchers Pave the Way for Anthrax Spore Standards
Researchers from the National Institute of Standards and Technology (NIST) and the U.S. Army Dugway (Utah) Proving Ground have developed reliable methods based on DNA analysis to assess the concentration and viability of anthrax spores after prolonged storage. The techniques and data are essential steps in developing a reliable reference standard for anthrax detection and decontamination.
Bacillus anthracis, the bacterium that causes anthrax, has threatened human health for centuries. In 2001, it was used as a letter-borne terrorist weapon that killed five Americans. Since the tenacious bacterium can survive for decades in a stable spore state, the Department of Homeland Security (DHS) has been working with NIST to develop anthrax spore reference materials. These materials could be used as controls in laboratory studies of anthrax, to calibrate spore detection equipment and to assess the efficiency of spore decontamination methods.
Because sample stability is a key requirement for reference materials, NIST and Army researchers recently compared different methods for measuring the concentration, biological activity and stability of laboratory-grade Bacillus anthracis spores under different storage conditions. Bacillus anthracis (Sterne), a harmless vaccine strain, was used in the study. The results of the research will be published in an upcoming issue of the Journal of Applied Microbiology.*
Working with samples that had been stored up to 2 1/2 years, the research team used two classic microbiological techniques to quantify the Bacillus anthracis concentrations: counting spores under a microscope and counting the bacterial colonies that grow after the spores are spread on a nutrient surface and germinate. The latter yields valuable data on the biological activity of the samples; however, only viable cells are counted, and counts may be underestimated if cell clumping occurs. A better approach is to measure the amount of genetic material present in the sample. This method measures not only the DNA extracted from viable anthrax spores but also DNA in solution from damaged spores, cell debris and spore fragments—giving a more complete measure of the DNA in the samples. Additionally, many of the new instruments available for rapid detection of anthrax spores are based on DNA markers, so it is important to accurately measure the DNA content of the reference samples that will be used to test and calibrate these devices.
Traditional methods for extracting DNA from Bacillus anthracis spores are too harsh to produce material suitable for reliable measurements. To overcome this obstacle, the team developed an extraction technique that used chemicals and enzymes to disrupt intact spores into releasing their DNA in a relatively pure state.
The NIST-Army study showed that laboratory-grade Bacillus anthracis spores in suspension maintained their viability and did not clump when stored for up to 900 days. The classical method for counting spores yielded results comparable to the DNA measurements used to determine spore concentrations. The results demonstrate that research-quality spores can be stored for long periods of time and still maintain their important properties, showing that uniform and consistent reference materials are feasible.
* J.L. Almeida, B. Harper and K.D. Cole. Bacillus anthracis spore suspensions: determination of stability and comparison of enumeration techniques. Journal of Applied Microbiology, 2008.
Media Contact: Michael E. Newman, firstname.lastname@example.org, 301-975-3025
Carbon Nanotube Measurements: Latest in NIST ‘How-To’ Series
The National Institute of Standards and Technology (NIST), in collaboration with the National Aeronautics and Space Administration (NASA), has published detailed guidelines* for making essential measurements on samples of single-walled carbon nanotubes (SWCNTs). The new guide constitutes the current “best practices” for characterizing one of the most promising and heavily studied of the new generation of nanoscale materials.
The nanotubes are essentially cylinders of carbon atoms with a wall only one atom thick and a diameter of a couple of nanometers—but lengths up to several million times their diameter. (Think of a soup can about 100 kilometers tall.) Because of their unique electronic, thermal, optical and mechanical properties, they are being studied for a wide—and expanding—range of applications, including ultrastrong fibers for nanocomposite materials, circuit elements in molecular electronics, hydrogen storage components for fuel cells and light sources for compact, efficient flat-panel displays. One basic problem is assuring the quality and purity of SWCNT materials. All known techniques for producing these tiny tubes also produce large quantities of nanojunk: simple graphite and carbon soot often encapsulating small metal particles used to catalyze the nanotube synthesis process. (See, for example, “NIST Laser-Based Method Cleans Up Grubby Nanotubes”, Tech Beat Dec. 1, 2006.)
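The soup-can analogy is simply the nanotube’s aspect ratio scaled up to everyday size, as a quick back-of-envelope calculation shows. The specific dimensions below (2 nm diameter, a 6.6 cm can, an aspect ratio of 1.5 million) are illustrative assumptions consistent with the figures in the text, not measured values.

```python
# Back-of-envelope check of the soup-can analogy (illustrative numbers).
nanotube_diameter = 2e-9    # meters: "a couple of nanometers"
aspect_ratio = 1.5e6        # "several million times" the diameter

nanotube_length = nanotube_diameter * aspect_ratio   # ~3 mm
soup_can_diameter = 0.066   # meters: a typical can is about 6.6 cm across
scaled_height = soup_can_diameter * aspect_ratio     # same aspect ratio

print(f"nanotube length: {nanotube_length * 1e3:.0f} mm")
print(f"soup can at that aspect ratio: {scaled_height / 1e3:.0f} km tall")
```

A can 6.6 cm wide stretched to the same length-to-width ratio would indeed stand roughly 100 kilometers tall.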
Accurate, reliable and preferably rapid measurement techniques are needed to optimize production processes to create more product and fewer impurities. These will help to control cleaning and purifying processes and ultimately to improve the confidence of buyers and sellers of SWCNT materials. Beginning in 2003, NIST and NASA researchers started addressing the problem by sponsoring a series of workshops devoted to nanotube measurements. The NIST “Recommended Practice Guide” on Measurement Issues in Single Wall Carbon Nanotubes grew out of the second workshop in 2005, and represents what industry, government and academic researchers regard as the most useful and accurate measurement techniques for characterizing the purity of SWCNT samples. The techniques discussed include thermogravimetric analysis; near-infrared spectroscopy; Raman spectroscopy; and optical, electron and scanned probe microscopy. Researchers from the NASA Johnson Space Center, the University of California at Riverside, Boston University and the NASA Langley Research Center contributed to the guide.
The techniques described in the guide were proposed as the basis for international standards for nanotube characterization. A collaborative effort that includes the US, China, Japan, and Korea is now underway under the International Organization for Standardization (ISO) to develop these techniques into standards that will help ensure uniform characterization metrics used when buying and selling nanotubes. The editors caution that in the fast-moving field of carbon nanotubes, characterization methods will need to be updated periodically.
The NIST Recommended Practice Guides are a set of publications devoted to specific, challenging measurement issues faced in industry and research. Online copies of Measurement Issues in Single Wall Carbon Nanotubes and other guides in the series are available at The “How To Measure” Book Series.
* S. Freiman, S. Hooker, K. Migler and S. Arepalli (eds.). Measurement Issues in Single Wall Carbon Nanotubes. NIST Special Publication 960-19, March 2008.
Media Contact: Michael Baum, email@example.com, 301-975-2763
Comments Requested on Draft Earthquake Hazards Plan
Earthquakes cannot be prevented, but their impacts on life, property and the economy can and should be managed. That’s the challenge that Congress has given the National Earthquake Hazards Reduction Program (NEHRP). The federal interagency program works to reduce earthquake losses through improved design and construction techniques for new and existing buildings and lifelines, monitoring and early-warning systems, coordinated emergency preparedness plans, and public education. The National Institute of Standards and Technology (NIST), the lead agency for NEHRP, has released a draft strategic plan for public review and comment through May 9, 2008.
The final plan for 2008-2012 will guide the activities of the four NEHRP agencies—the Federal Emergency Management Agency (FEMA), the National Science Foundation (NSF), the United States Geological Survey (USGS) and NIST—working in partnership with state and local governments, private enterprise, professional organizations and academia.
The draft plan lists nine strategic priorities for the next four years to understand earthquake phenomena, develop cost-effective measures to reduce impacts on individuals, society and construction, and improve rapid community recovery from earthquakes. Some of these include implementation of the Advanced National Seismic System for impact notification, deployment of response, hazard assessments and research; development of cost-effective techniques and tools to design new earthquake-resistant buildings and to improve the survivability of existing buildings; creation of realistic earthquake scenarios to help communities and businesses better understand and plan for earthquake consequences; and design of earthquake-resilient infrastructure to reduce vulnerabilities and possible cascading failures in critical, interconnected transportation, port, energy, water, sewage, communications and industrial production systems.
Following the public comment period, the NEHRP agencies will jointly consider edits to the plan. The final strategic plan is to guide NEHRP agency funding decisions.
The draft NEHRP Strategic Plan and information on how to submit comments are available on the NEHRP Web site at www.nehrp.gov. The 2008 NEHRP Annual Report submitted to Congress last month is also available on the Web site.
Media Contact: John Blair, firstname.lastname@example.org, 301-975-4261
Comments Requested on Risk Management Publication
The National Institute of Standards and Technology (NIST) has released the second public draft of NIST Special Publication 800-39, Managing Risk from Information Systems: An Organizational Perspective, for comment. This is the flagship publication in a series of standards and guidelines developed by NIST that relate to the Federal Information Security Management Act.
Information technology is critical to the nation’s economy, and threats to information systems are real and costly. These dangers can come from human error, environmental problems and planned attacks. They can compromise information confidentiality, integrity and availability.
Special Publication 800-39 provides a framework for managing the risk arising from the operation and use of information systems and is built upon a common foundation of best security practices. The target audience for this publication includes agency heads, chief information officers, information system designers, developers and administrators, auditors and inspectors general.
NIST is responsible for establishing information security standards and guidelines for federal agencies that own and operate information systems processing, storing or transmitting information other than classified, national security-related information. The agency also works closely with the Office of the Director of National Intelligence and the Department of Defense to develop key information security standards and guidelines for use across the federal government.
The public comment period is from April 7-30, 2008. Comments should be emailed to email@example.com.
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
Technology Grants Program to Offer Webcast Series
The new Technology Innovation Program (TIP) at the National Institute of Standards and Technology is launching a series of webcasts to inform interested parties about the new grants program designed to fund high-risk, high-reward research in areas of critical national need.
In the first two of a planned series, TIP Director Marc Stanley will discuss “An Overview of TIP” on Wed., April 23, 2008, from 2:00 p.m. to 3:30 p.m. EDT, and “Areas of Critical National Need” from 2:00 p.m. to 3:30 p.m. EDT on Thurs., May 15, 2008.
More webcast topics will be announced in the near future. Interested viewers can check schedules and obtain webcast login information from the TIP Web site, www.nist.gov/tip, later this week.
There is no cost to participate in the webcasts. Participants will need an internet connection and a phone line. For those unable to join the live webcasts, an archived version of each event will be posted to the TIP Web site shortly after it concludes.
Media Contact: Michael Baum, email@example.com, 301-975-2763
Metal Detectives: New Book Details Titanic Investigation
In February 1998, Timothy Foecke, a metallurgist at the National Institute of Standards and Technology (NIST), published a rather low-key “interagency report” that cast a new and intriguing light on one of the most famous marine disasters in history, the sinking of RMS Titanic. Now in a new book, What Really Sank the Titanic, Foecke and colleague Jennifer Hooper McCarty tell the full story of their investigation into how one of the most unassuming of villains—the wrought iron hull rivets—brought about the loss of one of the world’s most advanced ocean liners and more than 1,500 passengers and crew.
While there obviously were several factors, Foecke argued in his preliminary report, the critical failure could have been the use of brittle, substandard wrought iron rivets to hold the giant ship’s hull plates together. (See “Failure of Tiny Rivets May Have Sunk ’Unsinkable’ Liner” NIST Update, Feb. 17, 1998.) The shock of the collision with an iceberg would have popped off the heads of large numbers of rivets, allowing hull plates to open up like a zipper and let in the seawater. What Really Sank the Titanic gives the back story. The book ranges from vignettes on the working conditions of Irish shipyard workers in 1912 and the business pressures on the White Star Line, the ship's owners, to the science and techniques of a modern metallurgy lab, with side trips to review contemporary testimony on the accident, as Foecke and McCarty combine history and forensics to discover just what caused the ill-fated ship’s demise.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763