Tech Beat - November 3, 2009

Editor: Michael Baum
Contact: inquiries@nist.gov

Capturing Those In-Between Moments: NIST Solves Timing Problem in Molecular Modeling

A theoretical physicist at the National Institute of Standards and Technology (NIST) has developed a method for calculating the motions and forces of thousands of atoms simultaneously over a wider range of time scales than previously possible. The method overcomes a longstanding timing gap in modeling nanometer-scale materials and many other physical, chemical and biological systems at atomic and molecular levels.

Colorized simulation of what happens to 1,100 carbon atoms in a ‘flat’ sheet of graphene about 20 microseconds after the central atom is moved slightly upwards. Darker violet colors indicate atoms that have dropped below their original position, whereas lighter green colors show where atoms have risen.

A video clip (6.5 MB AVI file) shows how ripples propagate in a sheet of graphene after the central atom is moved slightly upwards; the clip is a slow-motion version of action that occurs over about 45 microseconds.

Credit: V.K. Tewary/NIST

The new mathematical technique* can significantly improve modeling of atomic-scale processes that unfold over time, such as vibrations in a crystal. Conventional molecular dynamics (MD) techniques can accurately model processes that occur in increments measured in picoseconds to femtoseconds (trillionths to quadrillionths of a second). Other techniques can be used over longer periods to model bulk materials but not at the molecular level. The new NIST technique can access these longer time scales—in the critical range from nanoseconds to microseconds (billionths to millionths of a second)—at the molecular level. Scientists can now model what happens at key points in time that were previously inaccessible, across the full spectrum of time scales of interest in MD, says developer Vinod Tewary.

Modeling of material properties and physical processes is a valuable aid and supplement to theoretical and experimental studies, in part because experiments are very difficult at the nanoscale. MD calculations are usually based on the physics of individual atoms or molecules. This traditional approach is limited not only by time scale, but also by system size. It cannot be extended to processes involving thousands of atoms or more because today’s computers—even supercomputers—cannot handle the billions of time steps required, Tewary says. By contrast, his new method incorporates a “Green’s function,” a mathematical approach that can calculate the condition of a very large system over flexible time scales in a single step. Thus, it overcomes the system size problem as well as the timing gap.
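
To see the scale of the time-step problem, it helps to count integration steps. The sketch below is a back-of-envelope illustration, assuming a typical 1-femtosecond step for a conventional integrator; it is not Tewary's method, just the arithmetic behind the bottleneck his approach avoids.

```python
# Why conventional MD stalls before microseconds: step-count arithmetic.
# (Illustrative only; the 1 fs step is an assumed textbook value.)
STEP_S = 1e-15  # typical MD integration step, in seconds

for label, duration_s in [("1 picosecond", 1e-12),
                          ("1 nanosecond", 1e-9),
                          ("1 microsecond", 1e-6)]:
    steps = duration_s / STEP_S
    print(f"{label}: {steps:.0e} time steps")
# 1 microsecond -> 1e+09 steps; with thousands of atoms that is trillions
# of force evaluations, whereas a Green's function reaches the same time
# in a single step.
```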

Tewary illustrated the new technique on two problems. He showed how a pulse propagating through a string of atoms, initiated by moving the middle atom, could be modeled for just a few femtoseconds with conventional MD, whereas the NIST method works for several microseconds. Tewary also calculated how ripples propagate in 1,100 carbon atoms in a sheet of graphene over periods up to about 45 microseconds, a problem that could not be solved previously. Although graphene is normally thought of as a static flat sheet, its atoms actually must undulate to remain stable, and the new technique shows how these ripples propagate (see accompanying image and movie). Consisting entirely of carbon atoms, graphene is a recently discovered honeycomb crystal material that may be an outstanding conductor for wires and other components in nanoscale electronics.
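
The first test problem is easy to mimic with conventional MD, which also makes its limitation concrete. The sketch below is a minimal illustration, not Tewary's code: it assumes harmonic nearest-neighbor forces in arbitrary units and uses velocity-Verlet time stepping to launch a pulse by displacing the middle atom of a chain.

```python
import numpy as np

# Minimal conventional-MD sketch: a pulse launched in a 1-D atomic chain
# by displacing the middle atom (spring constant, mass and step size are
# assumed values in arbitrary units, not parameters from the paper).
N, K, M, DT = 101, 1.0, 1.0, 0.01

x = np.zeros(N)      # displacements from equilibrium
v = np.zeros(N)      # velocities
x[N // 2] = 0.1      # nudge the middle atom to start the pulse

def forces(x):
    """Harmonic nearest-neighbor forces; end atoms held fixed."""
    f = np.zeros_like(x)
    f[1:-1] = K * (x[2:] - 2.0 * x[1:-1] + x[:-2])
    return f

f = forces(x)
for _ in range(4000):            # velocity-Verlet integration
    v += 0.5 * DT * f / M
    x += DT * v
    f = forces(x)
    v += 0.5 * DT * f / M

front = N // 2 + int(np.argmax(np.abs(x[N // 2:])))
print(f"pulse front near atom {front} after {4000 * DT:.0f} time units")
```

Reaching microseconds this way would take billions of such loop iterations, which is the gap the Green's function method closes.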

The new NIST technique is expected to enable modeling of many other processes that occur at time scales of nano- to microseconds, such as formation and growth of defects, conduction of heat, diffusion and radiation damage in materials. The technique could improve results in many different fields, from modeling of new nanotechnologies in the design stage to simulating the radiation damage from a “dirty bomb” over time.

NIST researchers plan to write a software program encoding the new technique to make it available to other users.

* V.K. Tewary. Extending time scale in molecular dynamics simulations: propagation of ripples in graphene. Physical Review B, Vol. 80, No. 16. Published online Oct. 22, 2009.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880

NIST Quantifies Low Levels of 'Heart Attack Risk' Protein

Searching for a needle in a haystack may seem futile, but it's worth the effort if the needle is a hard-to-detect protein, circulating within a haystack of human serum (the liquid component of blood), that may identify a person at high risk of a heart attack.

Computer-generated image of the structure for C-reactive protein

Credit: S. Kolstoe, Center for Amyloidosis, University College, London, UK

C-reactive protein (CRP), a molecule produced by the liver in response to inflammation, normally accounts for less than 1/60,000 of a person's total serum protein, or about 1 milligram per liter (mg/L) of serum. Recent evidence suggests that a CRP level between 1 and 3 mg/L indicates a moderate risk of cardiovascular disease, while a level greater than 3 mg/L predicts a high risk. A clinical diagnostic procedure known as the high-sensitivity CRP (hsCRP) test has been used to detect higher-than-normal levels of the protein and warn a patient about elevated risk of cardiovascular disease. However, there is no certified reference material—in this case, a sample of human serum with accurately determined amounts of CRP at the various risk levels—against which the accuracy of methods for measuring CRP can be evaluated. The problem: CRP levels in healthy, low-risk individuals are so low that even mass spectrometry (a very sensitive technique for separating and identifying molecules based on mass) cannot easily quantify them.
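
The scale of the problem is easy to check: total serum protein runs to tens of grams per liter, so 1 mg/L of CRP really is about one part in 60,000. The sketch below works that arithmetic and encodes the risk bands quoted above; the 60 g/L total-protein figure is an assumed typical value, not a number from the study.

```python
# Scale check and risk bands for CRP (illustrative; 60 g/L total serum
# protein is an assumed typical value).
TOTAL_PROTEIN_MG_PER_L = 60_000.0   # ~60 g/L of total serum protein
crp = 1.0                           # normal CRP level, mg/L

print(f"CRP fraction of serum protein: 1/{TOTAL_PROTEIN_MG_PER_L / crp:,.0f}")

def cvd_risk(crp_mg_per_l):
    """Cardiovascular risk bands from the thresholds quoted above."""
    if crp_mg_per_l < 1.0:
        return "normal/low"
    if crp_mg_per_l <= 3.0:
        return "moderate"
    return "high"

for level in (0.5, 2.0, 4.0):
    print(f"{level} mg/L -> {cvd_risk(level)} risk")
```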

In a recent paper in Analytical Chemistry,* NIST researchers Eric Kilpatrick and David Bunk describe the first steps toward development of a certified reference material that can be used to assess the accuracy of routine clinical laboratory tests for CRP. They accomplished this by isolating the minute amounts (less than 1 mg/L) of CRP circulating at normal levels in serum prior to measurement. Using a protein isolation technique called affinity purification, Kilpatrick and Bunk added polystyrene beads coated with anti-CRP antibodies to normal human serum. The antibodies bind tightly to any circulating CRP, allowing it to be easily removed from solution. The researchers then cleaved the purified protein into its component parts, known as peptides, using enzyme digestion. The peptides are more readily measured by the mass spectrometer, enabling a very precise determination of the total CRP.

To see if their purification method yields CRP that can serve as a reference material, Kilpatrick and Bunk will next mix purified CRP with genetically engineered CRP containing a heavy isotope of nitrogen (15N) and then run the combined pool through affinity purification, enzyme digestion and mass spectrometry. The peptides with the heavy 15N atoms will be easily detected and precisely quantified by the mass spectrometer. If the measurements for the 15N-tagged peptides compare favorably to those made for the purified serum CRP, then that will validate the use of the affinity purification method for quantifying extremely low levels of the protein. In turn, this validation will clear the way for purified serum CRP derived by the NIST method to be eventually used as a quality control and calibration tool by manufacturers for the hsCRP test.
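
Quantification with an isotope-labeled internal standard rests on a simple ratio: spike a known amount of 15N-labeled CRP into the sample, measure the mass-spectrometer signal for the light (serum) and heavy (15N) versions of the same peptide, and scale. The sketch below shows that calculation in generic form; the peak areas and spike amount are invented for illustration and the function is not taken from the NIST paper.

```python
# Generic isotope-dilution calculation (illustrative numbers; not data
# from the Kilpatrick & Bunk paper).
def conc_by_isotope_dilution(area_light, area_heavy, spike_mg_per_l):
    """Unlabeled-analyte concentration from the light/heavy peak-area
    ratio of the same peptide, assuming both ionize identically."""
    return spike_mg_per_l * (area_light / area_heavy)

# Hypothetical peak areas for one peptide from the enzyme digest:
print(conc_by_isotope_dilution(area_light=8.2e5,
                               area_heavy=9.5e5,
                               spike_mg_per_l=1.0))  # ~0.86 mg/L
```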

* E.L. Kilpatrick and D.M. Bunk. Reference measurement procedure development for C-reactive protein in human serum. Analytical Chemistry, Vol. 81, No. 20, pp. 8610-8616. Oct. 15, 2009.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025

Scientists Build First ‘Frequency Comb’ To Display Visible ‘Teeth’

Finally, an optical frequency comb that visibly lives up to its name.

Scientists at the University of Konstanz in Germany and the National Institute of Standards and Technology (NIST) in the United States have built the first optical frequency comb—a tool for precisely measuring different frequencies of visible light—that actually looks like a comb.

Photographs of four different regions of the new optical frequency comb. The light is filtered through a grating spectrometer and photographed with a digital camera through a microscope. Each visible line or "tooth" is an individual frequency in the comb, which spans the visible spectrum from red to blue. More than 1,500 such photos would need to be lined up to show the entire comb.

Credit: S. Diddams/NIST

As described in the Oct. 30 issue of Science,* the “teeth” of the new frequency comb are separated enough that, when viewed with a simple optical system—a grating and microscope—the human eye can see each of the approximately 50,000 teeth spanning the visible color spectrum from red to blue. A frequency comb with such well-separated, visibly distinct teeth will be an important tool for a wide range of applications in astronomy, communications and many other areas.

A basis for the 2005 Nobel Prize in physics, frequency combs are now commonplace in research laboratories and next-generation atomic clocks. But until now, comb teeth have been so closely spaced that they were distinguishable only with specialized equipment and great effort, and the light never looked like the evenly striped pattern of the namesake comb to the human eye.

Each tooth of the comb is a different frequency, or color (although the human eye can’t distinguish the very small color differences between nearby teeth). A frequency comb can be used like a ruler to measure the light emitted by lasers, atoms, stars or other objects with extraordinarily high precision. Other frequency combs with finer spacing are highly useful tools, but the new comb with more visibly separated teeth will be more effective in many applications such as calibrating astronomical instruments.

The new comb is produced by a dime-sized laser that generates super-fast, super-short pulses of high-power light containing tens of thousands of different frequencies. As in any frequency comb, the properties of the light over time are converted to tick marks or teeth, with each tooth representing a progressively higher number of oscillations of light waves per unit of time. The shorter the pulses of laser light, the broader the range of frequencies produced. In the new comb described in Science, the laser pulses are even shorter and repeated 10 to 100 times faster than in typical frequency combs. The laser emits 10 billion pulses per second, with each pulse lasting about 40 femtoseconds (quadrillionths of a second), producing extra-wide spacing between individual comb teeth.

Another unusual feature of the new comb is efficient coupling of the laser pulses into a “nonlinear” optical fiber, which dramatically expands the spectrum of frequencies in the comb. Since details of the unusually powerful dime-sized laser were first published in 2008, scientists have doubled the average pulse power directed into the fiber, enabling the comb to reach blue colors for the first time, producing a spectrum across a range of wavelengths from 470 to 1130 nanometers, from blue to infrared. The 50,000 individual colors become visible when the light emitted from the fiber is filtered through a grating spectrometer, a common laboratory instrument that acts like a souped-up prism.
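
Those numbers hang together in a simple way: the tooth spacing equals the pulse repetition rate, so the tooth count is roughly the optical bandwidth divided by that rate. The sketch below runs the estimate from the 10 GHz repetition rate and the 470-1130 nm span quoted above; it is a back-of-envelope check, not a calculation from the paper, and it lands at the same order of magnitude as the roughly 50,000 teeth reported.

```python
# Back-of-envelope comb tooth count (illustrative check, not a result
# from the Science paper). Tooth spacing = pulse repetition rate.
C = 2.998e8          # speed of light, m/s
REP_RATE_HZ = 10e9   # 10 GHz repetition rate quoted in the article

f_blue = C / 470e-9   # blue edge of the reported spectrum
f_ir = C / 1130e-9    # infrared edge
teeth = (f_blue - f_ir) / REP_RATE_HZ
print(f"optical bandwidth: {(f_blue - f_ir) / 1e12:.0f} THz")
print(f"approximate tooth count: {teeth:,.0f}")  # tens of thousands
```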

The broad spectrum spanned by the comb—unusual for such a fast pulse rate—enables all the frequencies to be stabilized, using a NIST-developed technique that directly links optical and radio frequencies. Stabilization is crucial for applications.

The ability to directly observe and use individual comb teeth will open up important applications in astronomy, studies of interactions between light and matter, and precision control of high-speed optical and microwave signals for communications, according to the paper. NIST scientists previously have shown, for example, that this type of frequency comb could boost the sensitivity of astronomical tools searching for other Earthlike planets as much as a hundredfold. In addition, the new comb could be useful in a NIST project to develop optical signal-processing techniques, which could dramatically expand the capabilities of communications, surveillance, optical pattern recognition, remote sensing and high-speed computing technologies.

The laser was built by Albrecht Bartels at the Center for Applied Photonics of the University of Konstanz. The frequency comb was built and demonstrated in the lab of NIST physicist Scott Diddams in Boulder, Colo.

As a non-regulatory agency of the U.S. Department of Commerce, NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards and technology in ways that enhance economic security and improve our quality of life.

* A. Bartels, D. Heinecke, and S.A. Diddams. 10 GHz Self-referenced Optical Frequency Comb. Science. Oct. 30, 2009.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880

NIST Test Proves 'The Eyes Have It' for ID Verification

The eyes may be the mirror to the soul, but the iris reveals a person's true identity--its intricate structure constitutes a powerful biometric. A new report by computer scientists at the National Institute of Standards and Technology (NIST) demonstrates that iris recognition algorithms can maintain their accuracy and interoperability with compact images, affirming their potential for large-scale identity management applications such as the federal Personal Identity Verification program, cyber security and counterterrorism.

These compressed iris images from the IREX I test illustrate why the JPEG format did not meet the test criteria. The JPEG 2000 format (top) retains its quality after compression for storage and transmission, while the JPEG format becomes pixelated when compressed by the same amount.

Credit: NIST

After fingerprints, iris recognition has emerged in recent years as the second most widely supported biometric characteristic. This marketplace rests, in large part, on the ability of recognition algorithms to process standard images from the many cameras now available. This requires images to be captured in a standard format and prepared so that they are compact enough for a smart card and for transmission across global networks. The images also have to be identifiable by computer algorithms and interoperable with any iris-matcher product regardless of the manufacturer.

NIST scientists are working with the international biometrics community to revise iris recognition standards and to advance iris images as the global interchange medium in this rapidly evolving field.

NIST established the Iris Exchange (IREX) program as a NIST-industry collaboration to encourage development of iris recognition algorithms operating on images that conform to the new ISO/IEC 19794-6 standard. The first IREX project, IREX I, provided quantitative support for the standard by conducting the largest independently administered test of iris recognition technology to date. The test attracted 19 recognition technologies from 10 different providers, representing an order-of-magnitude expansion of the industry over the past five years.

The international standard, now under revision, defined three competing image formats and three compression methods. The IREX I test narrowed the field by determining which ones performed consistently at a high level; those results are included in the IREX report. The image format tests showed that two of the three formats performed well: one centers and crops the iris, and the other centers and crops the iris and masks eyelids and eyelashes. The study also found that two compression standards can squeeze the images to a size small enough for storage and transmission while retaining the necessary quality: JPEG 2000, which gives better recognition accuracy than the more commonly used JPEG, and PNG, which employs lossless compression to completely preserve the iris information.
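
The compression comparison is easy to reproduce in miniature: for a given size budget, JPEG 2000 degrades more gracefully than baseline JPEG, while PNG is lossless but larger. The sketch below compares output sizes using the Pillow imaging library; "iris.png" is a placeholder file name, and the JPEG 2000 branch assumes a Pillow build with OpenJPEG support.

```python
import io

from PIL import Image  # Pillow; JPEG 2000 saving requires OpenJPEG

# Compare compressed sizes for one image (illustrative; "iris.png" is a
# placeholder for any grayscale iris image).
img = Image.open("iris.png").convert("L")

for fmt, kwargs in [("PNG", {}),                        # lossless
                    ("JPEG", {"quality": 30}),          # lossy, blocky at low rates
                    ("JPEG2000", {"quality_mode": "rates",
                                  "quality_layers": [20]})]:  # ~20:1 compression
    buf = io.BytesIO()
    img.save(buf, format=fmt, **kwargs)
    print(f"{fmt}: {buf.tell():,} bytes")
```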

The IREX I tests also examined technical factors affecting users, including speed-accuracy tradeoffs, threshold calibration, storage requirements, image quality assessment, and the effects of iris size, eyelid occlusion and pupil dilation. The test results show that forensic applications, where image quality is sometimes degraded, can benefit from slower but more powerful algorithms.

Recommendations based on the NIST results have been adopted by the standards committees. The report, IREX I: Performance of Iris Recognition Algorithms on Standard Images, can be downloaded from http://iris.nist.gov/irex. Since its inception in 2007, IREX has helped advance iris recognition toward the level of technical maturity and interoperability of fingerprint biometrics and has affirmed the potential for using iris biometrics as a second modality for large-scale identity management applications.

Meanwhile, plans for IREX II are under way to calibrate and evaluate the effectiveness and efficiency of iris image quality assessment algorithms. This study will support a new international iris image quality standard by identifying specific iris image properties that are influential on recognition accuracy. The second draft of the IREX II research plan—available online at http://iris.nist.gov/irexII—is open for comments until Nov. 15, 2009. Comments should be submitted to irex@nist.gov.

Funding for IREX is provided by both the Department of Homeland Security's Office of US-VISIT and its Science and Technology Directorate.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661

Quicklinks

NIST, CU to Build Instrument to Help Search for Earth-like Planets

The National Institute of Standards and Technology (NIST) will collaborate with the University of Colorado at Boulder (CU) to build and apply a custom laser-based instrument--a frequency comb--to help search for extrasolar Earth-like planets. The National Science Foundation has awarded CU a $495,000 grant for the joint project.

Frequency combs are tools for precisely measuring different frequencies of light (see the background document “Optical Frequency Combs” for more information). NIST physicist Scott Diddams, together with CU astronomer Steve Osterman and other colleagues, will design and build an unusual comb with “teeth” (individual frequencies) that are widely spaced enough for astronomical instruments to read. The new comb will be used to calibrate measurements of subtle changes in infrared starlight caused by a star wobbling from the gravitational pull of an orbiting planet. Frequency combs could be such superior calibration tools that they would make it possible to detect even tiny Earthlike planets that cause color shifts equivalent to a star wobble of just a few centimeters per second. Current astronomical instruments can detect—at best—a wobble of about 1 meter per second.
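
The sensitivity targets map directly onto fractional frequency shifts through the Doppler relation Δf/f = v/c. The sketch below works out that shift for both the roughly 1 m/s capability of current instruments and the few-centimeters-per-second wobble of an Earth-like planet; the numbers are illustrative, not taken from the grant.

```python
# Doppler shifts a planet-hunting spectrograph must resolve (illustrative).
C = 2.998e8  # speed of light, m/s

for label, v_m_per_s in [("current instruments", 1.0),
                         ("Earth-like planet wobble", 0.05)]:
    shift = v_m_per_s / C
    print(f"{label}: fractional shift v/c = {shift:.1e}")
# 0.05 m/s -> ~1.7e-10; at a ~500 THz optical frequency that is a shift
# of only ~80 kHz, which is why a frequency-comb "ruler" with stable,
# widely spaced teeth is needed to calibrate the spectrograph.
```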

The Boulder researchers plan to take the new laser instrument to the Apache Point Observatory northeast of Las Cruces, N.M., in the spring of 2010 to integrate it with a new planet-finding experiment. For more, see the CU news release “Nobel Prize-Winning Science From Boulder Serves as Springboard for Planet Hunting.”

Media Contact: Michael Baum, baum@nist.gov, 301-975-2763
