
Tech Beat - May 2, 2012

Editor: Michael Baum
Date created: May 2, 2012
Date Modified: May 2, 2012 
Contact: inquiries@nist.gov

NIST Physicists Benchmark Quantum Simulator with Hundreds of Qubits

Physicists at the National Institute of Standards and Technology (NIST) have built a quantum simulator that can engineer interactions among hundreds of quantum bits (qubits)—10 times more than previous devices. As described in the April 26 issue of Nature*, the simulator has passed a series of important benchmarking tests and scientists are poised to study problems in material science that are impossible to model on conventional computers.

Many important problems in physics—especially low-temperature physics—remain poorly understood because the underlying quantum mechanics is vastly complex. Conventional computers—even supercomputers—are inadequate for simulating quantum systems with as few as 30 particles. Better computational tools are needed to understand and rationally design materials, such as high-temperature superconductors, whose properties are believed to depend on the collective quantum behavior of hundreds of particles.

The NIST simulator consists of a tiny, single-plane crystal of hundreds of beryllium ions, less than 1 millimeter in diameter, hovering inside a device called a Penning trap. The outermost electron of each ion acts as a tiny quantum magnet and is used as a qubit—the quantum equivalent of a “1” or a “0” in a conventional computer. In the benchmarking experiment, physicists used laser beams to cool the ions to near absolute zero. Carefully timed microwave and laser pulses then caused the qubits to interact, mimicking the quantum behavior of materials otherwise very difficult to study in the laboratory. Although the two systems may outwardly appear dissimilar, their behavior is engineered to be mathematically identical. In this way, simulators allow researchers to vary parameters that couldn’t be changed in natural solids, such as atomic lattice spacing and geometry. In the NIST benchmarking experiments, the strength of the interactions was intentionally weak so that the simulation remained simple enough to be confirmed by a classical computer. Ongoing research uses much stronger interactions.

ion crystal schematic
The NIST quantum simulator permits study of quantum systems that are difficult to study in the laboratory and impossible to model with a supercomputer. The heart of the simulator is a two-dimensional crystal of beryllium ions (blue spheres in the graphic); the outermost electron of each ion is a quantum bit (qubit, red arrows). The ions are confined by a large magnetic field in a device called a Penning trap (not shown). Inside the trap the crystal rotates clockwise.
Credit: Britton/NIST

Simulators exploit a property of quantum mechanics called superposition, wherein a quantum particle is made to be in two distinct states at the same time, for example, aligned and anti-aligned with an external magnetic field. The number of states simultaneously available to three qubits, for example, is eight, and this number grows exponentially with the number of qubits: 2^N states for N qubits.
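The exponential scaling described above (2^N states for N qubits) can be tabulated with a short sketch. This is our illustration, not part of the original article:

```python
# The number of simultaneous basis states available to N qubits is 2**N.
def state_count(n_qubits: int) -> int:
    """Return the dimension of the state space for a register of qubits."""
    return 2 ** n_qubits

# State space grows exponentially: 3 qubits already have 8 states,
# and 30 qubits have over a billion.
for n in (1, 2, 3, 10, 30):
    print(f"{n:>3} qubits -> {state_count(n):,} states")
```

At 30 qubits the count passes one billion, which is roughly where the article notes conventional computers give out.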

Crucially, the NIST simulator also can engineer a second quantum property called entanglement between the qubits, so that even physically well separated particles may be made tightly interconnected.

ion crystal top view
The NIST quantum simulator permits study of quantum systems that are difficult to study in the laboratory and impossible to model with a supercomputer. In this photograph of the crystal, the ions are fluorescing, indicating the qubits are all in the same state. Under the right experimental conditions, the ion crystal spontaneously forms this nearly perfect triangular lattice structure.
Credit: Britton/NIST

Recent years have seen tremendous interest in quantum simulation; scientists worldwide are striving to build small-scale demonstrations. However, these experiments have yet to fully involve more than 30 quantum particles, the threshold at which calculations become impossible on conventional computers. In contrast, the NIST simulator has extensive control over hundreds of qubits. This order-of-magnitude increase in qubit number increases the simulator’s quantum state space exponentially. Just writing down on paper a state of a 350-qubit quantum simulator is impossible: it would require more than a googol of digits (a googol is 10 to the power of 100).
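That claim is easy to sanity-check with an illustrative calculation of ours (not from the article): the state space of a 350-qubit register has 2^350 basis states, and that count alone already exceeds a googol:

```python
# A 350-qubit register has 2**350 basis states; compare with a googol, 10**100.
states = 2 ** 350
googol = 10 ** 100
print(states > googol)     # True: the state count alone exceeds a googol
print(len(str(states)))    # decimal digits needed just to write the count
```

Writing down a full quantum state means recording an amplitude for each of those basis states, which is where the "more than a googol of digits" figure comes from.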

Over the past decade, the same NIST research group has conducted record-setting experiments in quantum computing,** atomic clocks and, now, quantum simulation. In contrast with quantum computers, which are universal devices that someday may solve a wide variety of computational problems, simulators are “special purpose” devices designed to provide insight about specific problems.

This work was supported in part by the Defense Advanced Research Projects Agency. Co-authors from Georgetown University, North Carolina State University and institutions in South Africa and Australia contributed to the research.

As a nonregulatory agency of the U.S. Department of Commerce, NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards and technology in ways that enhance economic security and improve our quality of life.

* J.W. Britton, B.C. Sawyer, A. Keith, C.-C. J. Wang, J.K. Freericks, H. Uys, M. J. Biercuk and J.J. Bollinger. Engineered 2D Ising interactions on a trapped-ion quantum simulator with hundreds of spins. Nature. (In press) doi:10.1038/nature10981.
** See the NIST 2009 news announcement, “NIST Develops Powerful Method of Suppressing Errors in Many Types of Quantum Computers,” at www.nist.gov/pml/div688/quantum_042209.cfm.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


Researchers Find Reducing Fishmeal Hinders Growth of Farmed Fish

When it comes to the food used to raise fish in aquaculture "farms," it seems that you may get what you pay for. In a new study,* researchers from the National Institute of Standards and Technology (NIST) and the South Carolina Department of Natural Resources (SCDNR) looked at the health effects of raising farmed fish on a diet incorporating less than the usual amount of fishmeal—a key but expensive component of current commercial fish food products. They learned that reduced fishmeal diets may be cheaper, but the fish were less healthy.

cobia fish
Cobia (Rachycentron canadum) harvested from a fish farm.
Credit: NOAA

Commercial aquaculture is one of the fastest growing areas of food production, generating about $100 billion in revenue annually and accounting for nearly half of the world's food fish supply. Aquafarmers currently rely heavily on fishmeal as a protein source, but it is expensive to produce, and the resource from which it is derived—fish captured in the wild—is being rapidly depleted. One proposed remedy is to substitute cheaper and more environmentally friendly feeds that replace some fishmeal content with other sources of protein.

SCDNR designed a study to evaluate the efficacy of diets with reduced and full amounts of fishmeal fed to cobia**, a popular marine aquaculture fish, during the period when juveniles mature to adults. One diet contained 50 percent and another 75 percent less fishmeal than that found in commercial food products. A third diet contained 100 percent of the conventional fishmeal content. A fourth group of cobia ate off-the-shelf fish food as a control.

To determine whether or not the three experimental diets provided adequate nutrition for fish growth, the SCDNR teamed with NIST's nuclear magnetic resonance (NMR) spectroscopy experts at the Hollings Marine Laboratory (HML) in Charleston, S.C. NMR spectroscopy, a technique similar to magnetic resonance imaging (MRI) used by doctors, allows researchers to isolate and identify specific nutrients after the fish have metabolized them—a quantifiable measure of how well or how poorly the different fishmeal diets were utilized.

The results showed that cobia fed the reduced fishmeal diets were metabolically different from those fed either the full fishmeal diet or the control diet. Fish fed the reduced fishmeal diets had higher levels of two metabolites linked to physical stress, tyrosine and betaine, and lower levels of a primary energy source, glucose. This suggests that these cobia were not receiving the necessary nutrition to support healthy growth.

Overall, the researchers were surprised to find that cobia on the experimental 100 percent fishmeal diet showed the most growth by the end of the 100-day study period. Along with more normal tyrosine, betaine and glucose levels, NMR spectroscopy also revealed significantly higher levels of lactate in cobia fed 100 percent fishmeal compared to fish on the other diets. This finding may be explained by the fact that the 100 percent fishmeal experimental diet has the highest percentage of the carbohydrate cornstarch, and lactate is produced by gut bacteria metabolizing carbohydrates. In turn, since efficient breakdown of carbohydrates is essential to energy production, the researchers surmise that a diet enhancing gut microflora activity might be one of the conditions needed for optimal cobia health.

Although the reduced fishmeal diets in this study did not fare well, the NIST and SCDNR researchers say that the data from the NMR-based metabolomic analysis still provide insight into what might be needed for more successful formulations. They expect that future studies will eventually lead to alternative dietary products that are more cost effective, better for the environment and lead to high yields of healthy fish.

The HML is a unique partnership of governmental and academic agencies including NIST, the National Oceanic and Atmospheric Administration's National Ocean Service, the SCDNR, the College of Charleston and the Medical University of South Carolina. The HML NMR facility focuses on the multi-institutional mission of metabolomics, natural products and structural biology.

* T.B. Schock, S. Newton, K. Brenkert, J. Leffler and D.W. Bearden. An NMR-based metabolomic assessment of cultured cobia health in response to dietary manipulation. Food Chemistry, Vol. 133, No. 1, pages 90-101, July 1, 2012.
** Rachycentron canadum.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


New Protocol Enables Wireless and Secure Biometric Acquisition with Web Services

Researchers at the National Institute of Standards and Technology (NIST) have developed and published a new protocol for communicating with biometric sensors over wired and wireless networks—using some of the same technologies that underpin the web.

The new protocol, called WS-Biometric Devices (WS-BD), allows desktops, laptops, tablets and smartphones to access sensors that capture biometric data such as fingerprints, iris images and face images using web services. Web services themselves are not new; for example, video-on-demand services use web services to stream videos to mobile devices and televisions.

The WS-Biometric Devices protocol will greatly simplify setting up and maintaining secure biometric systems for verifying identity because such biometric systems will be easier to assemble with interoperable components compared to current biometrics systems that generally have proprietary device-specific drivers and cables. WS-BD enables interoperability by adding a device-independent web-services layer in the communication protocol between biometric devices and systems.
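As a rough illustration of what a device-independent web-services layer buys a client, here is a hypothetical sketch; the host name, service root and operation names below are our illustrative assumptions, not the normative operation set defined in NIST SP 500-288:

```python
# Hypothetical WS-BD-style client sketch: every sensor operation is addressed
# as a plain HTTP URL, so the client needs no device-specific driver.
from urllib.parse import urljoin

SERVICE_ROOT = "http://sensor.example.com/fingerprint/"  # assumed service root

def op_url(operation: str, session_id: str = "") -> str:
    """Build the URL for one web-service operation on the sensor."""
    path = f"{operation}/{session_id}" if session_id else operation
    return urljoin(SERVICE_ROOT, path)

# A client might register a session, trigger a capture, then download it.
session = "1234"  # placeholder id a real service would return on registration
for url in (op_url("register"),
            op_url("capture", session),
            op_url("download", session)):
    print(url)
```

Swapping in a different sensor then means changing only the service root, which is the interoperability point the protocol is after.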

Remember the last time you bought a new computer only to learn that you then had to upgrade your printer and find the appropriate drivers? For system owners, the difficulty of upgrading devices on a biometric system can mean significant costs. Using the WS-BD protocol eliminates that problem.

"This would be useful to many organizations that house biometric systems, including border control and customs agencies," explained computer scientist Kevin Mangold. Using current biometric systems, when one biometric sensor breaks, it can be expensive and time-consuming to find a replacement because manufacturers often change product lines and phase out previous generation devices. A few broken devices could entail having to rebuild the entire system, upgrade devices and drivers that may be incompatible with host operating systems, and retrain personnel, he said.

Biometrics are playing an increasing role in security, access control and identity management. And their use is expanding—for example, some theme parks use biometrics for access control. Fingerprints are used in conjunction with passwords for computer security. Many jobs require employees to provide biometrics; using WS-BD equipment could potentially reduce costs by facilitating interoperability in biometrics devices.

A 2010 National Academies study, Biometric Recognition: Challenges and Opportunities, recognized that "Biometric systems should be designed to anticipate the development and adoption of new advances and standards, modularizing components that are likely to become obsolete, such as biometric sensors, and matcher systems, so that they can be easily replaced."

NIST researchers recognized this need several years ago and developed a solution with the support of the Department of Homeland Security Science and Technology Directorate, the Federal Bureau of Investigation's Biometric Center of Excellence and NIST's Comprehensive National Cybersecurity Initiative. NIST also is working with industry through the Small Business Innovation Research Program to help bring these plug-and-play biometric devices to market.

Two NIST researchers recently demonstrated the NIST-developed WS-BD system in their lab using a tablet and two biometric sensors (see video). A tap on the tablet signals the web-enabled fingerprint sensor to capture four fingerprints from the individual whose hand is on the scanner and send them back to the tablet. A tap on another button controls a camera that takes a photo for facial recognition.

The new protocol, Specification for WS-Biometric Devices (NIST Special Publication 500-288) can be found at www.nist.gov/manuscript-publication-search.cfm?pub_id=910334. Additional information on this and related projects can be found at http://bws.nist.gov.

While this is a final document, NIST welcomes your feedback, comments and questions for consideration in future updates. Send your comments to the WS-BD team by emailing 500-288comments@nist.gov.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


First Light: NIST Researchers Develop New Way to Generate Superluminal Pulses

Researchers at the National Institute of Standards and Technology (NIST) have developed a novel way of producing light pulses that are "superluminal"—in some sense they travel faster than the speed of light.* The technique, called four-wave mixing, reshapes parts of light pulses and advances them ahead of where they would have been had they been left to travel unaltered through a vacuum. The new method could be used to improve the timing of communications signals and to investigate the propagation of quantum correlations.

fast light
In four-wave mixing, researchers send "seed" pulses of laser light into a heated cell containing atomic rubidium vapor along with a separate "pump" beam at a different frequency. The vapor amplifies the seed pulse and shifts its peak forward, making it superluminal. At the same time, photons from the inserted beams interact with the vapor to generate a second pulse called the "conjugate." Its peak, too, can travel faster or slower depending on how the laser is tuned and the conditions inside the gain medium.
Credit: NIST

According to Einstein's special theory of relativity, light traveling in a vacuum is the universal speed limit. No information can travel faster than light.

But there is a loophole of sorts. A short burst of light arrives as a (usually) symmetric curve, like a bell curve in statistics. The leading edge of that curve can't exceed the speed of light, but the main hump, the peak of the pulse, can be skewed forward or backward, arriving sooner or later than it normally would.

Recent experiments have generated "uninformed" faster-than-light pulses by amplifying the leading edge of the pulse and attenuating, or cutting off, the back end. The method introduces a great deal of noise with no great increase in the apparent speed. Four-wave mixing produces cleaner, less noisy pulses with a greater increase in speed by "re-phasing" or rearranging the light waves that make up the pulse.

In four-wave mixing, researchers send 200-nanosecond-long "seed" pulses of laser light into a heated cell containing atomic rubidium vapor along with a separate "pump" beam at a different frequency from the seed pulses. The vapor amplifies the seed pulse and shifts its peak forward so that it becomes superluminal. At the same time, photons from the inserted beams interact with the vapor to generate a second pulse, called the "conjugate" because of its mathematical relationship to the seed. Its peak, too, can travel faster or slower depending on how the laser is tuned and the conditions inside the gain medium.

In the experiment, the pulses' peaks arrived 50 nanoseconds faster than light traveling through a vacuum.
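To put that 50-nanosecond advance in perspective, here is an illustrative calculation of ours (not from the paper) of how far light travels through vacuum in that time:

```python
# Distance light covers in a vacuum during the reported 50 ns peak advance.
SPEED_OF_LIGHT = 299_792_458   # m/s, exact by definition of the meter
advance_seconds = 50e-9        # the 50-nanosecond advance reported above
distance_m = SPEED_OF_LIGHT * advance_seconds
print(f"{distance_m:.1f} m")   # roughly 15 meters
```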

One immediate application that the group would like to explore for this system is quantum discord. Quantum discord mathematically defines the quantum information shared between two correlated systems—in this case, the seed and conjugate pulses. By performing measurements of quantum discord between fast beams and reference beams, the group hopes to determine how useful this fast light could be for the transmission and processing of quantum information.

* R. Glasser, U. Vogl and P. Lett. Stimulated generation of superluminal light pulses via four-wave mixing. Physical Review Letters, published online April 26, 2012.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


Fabrication Method Can Affect the Use of Block Copolymer Thin Films

A new study by a team including scientists from the National Institute of Standards and Technology (NIST) indicates that thin polymer films can have different properties depending on the method by which they are made. The results* suggest that deeper work is necessary to explore the best way of creating these films, which are used in applications ranging from high-tech mirrors to computer memory devices.

douglas films
The method of creating a thin film can have great effect on the material, such as the orientation of the tiny cylinders in this film proposed for use in computer memory. One method of film creation is far more effective at creating copolymer films with cylinders that stand on end (b), as they must to be usable. Scale bar represents 200 nanometers.
Credit: NIST

Thin films spread atop a surface have many applications in industry. Inexpensive organic solar cells might be made of such films, to name one potential use. Typically they're made by dissolving the polymer, and then spreading a small amount of the liquid out on a surface, called a substrate. The solution becomes a film as the solvent dries and the remainder solidifies. But as this happens, stresses develop within the film that can affect its structure.

Manufacturers would like to know more about how to control these stresses to ensure the film does what they want. But scientists who study film formation often use a different method of casting films than a manufacturer would. One method used in industry is "flow coating"—similar to spreading frosting across a cake. Another method is "spin casting"—placing a drop of liquid on a substrate that spins rapidly and spreads the droplet out evenly by centrifugal force. Both methods generally create smooth films, but the team decided to examine whether the two methods create different effects in finished films consisting of a self-assembling block copolymer.

"It's an important question because some proposed applications intend to take advantage of these effects," Douglas says.

The team's comparison led to results that surprised them. Although the rapid spinning of spin casting is very dynamic, suggesting it would convey more stress to the resulting film, it actually led to fewer residual stresses than flow coating did. As previous studies have shown that leftover solvent can lead to stresses in the film, the team's new theory is that because the solvent evaporates from the developing film more slowly in flow coating, this solvent discourages the film solids from arranging themselves into the equilibrium structure.

For one example, the practical benefits of this understanding could help manufacturers who propose making computer memory devices from thin films in which the solids arrange themselves as tiny cylinders. Such devices would require the cylinders to stand on end, not lie flat.

"We find we can get them to stand up much more easily with one casting method than another," Douglas says. "If we can get better results simply by varying the mode of film casting, we need to explore more deeply what happens when you make films by different methods."

* X. Zhang, J.F. Douglas and R.L. Jones. Influence of film casting method on block copolymer ordering in thin films. Soft Matter, Mar. 21, 2012. doi:10.1039/C2SM07308K.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


Ultrasound Idea: Prototype NIST/CU Bioreactor Evaluates Engineered Tissue While Creating It

Researchers at the National Institute of Standards and Technology (NIST) have developed a prototype bioreactor—a device for culturing cells to create engineered tissues—that both stimulates and evaluates tissue as it grows, mimicking natural processes while eliminating the need to stop periodically to cut up samples for analysis. Tissue created this way might someday be used to replace, for example, damaged or diseased cartilage in the knee and hip.

Jenni Popp
Biomedical engineer Jenni Popp with NIST’s prototype bioreactor for tissue engineering. The bioreactor both stimulates and evaluates engineered tissue as it grows.
Credit: Burrus/NIST

Conventional methods for evaluating the development and properties of engineered tissue are time-consuming, destructive and need to be repeated many times. By using ultrasound to monitor tissue during processing without destroying it, the novel bioreactor could be a faster and less expensive alternative.

"Most bioreactors don't do any type of nondestructive evaluation," says NIST postdoctoral researcher Jenni Popp, first author of a new paper* about the instrument. "Having some sort of ongoing evaluation of the developing tissue is definitely novel."

Cartilage is smooth connective tissue that supports joint motion. Natural cartilage is created by specialized cells that generate large amounts of structural proteins to weave a tough support material called extracellular matrix. Lacking blood vessels, cartilage has limited capability to heal from arthritis, sports injuries or other defects. Damage can be treated with drugs or joint replacement but results can be imperfect. Engineered tissue is used in some medical treatments but is not yet a routine alternative to metal or plastic joint replacements. The NIST bioreactor gives researchers a noninvasive way to monitor important structural changes in developing tissue.

bioreactor drawing
Drawing of NIST’s prototype bioreactor for tissue engineering. The bioreactor both stimulates and evaluates engineered tissue as it grows. Samples consisting of a mixture of cartilage cells and water-based gels (pink in the drawing) are placed in wells, and force is applied to the sample from above to mimic natural stimuli. Ultrasound techniques monitor tissue changes over time.
Credit: Gallagher/NIST

The NIST/CU bioreactor can fit inside a standard incubator, which controls temperature and acidity in the growth environment. The bioreactor applies force to stimulate five small cubes of cartilage cells embedded in water-based gels. The mechanical force mimics the natural stimuli needed for the cells to create matrix proteins and develop the structure and properties of real cartilage. Ultrasound techniques monitor tissue changes over time, while a digital video microscope takes images.

Preliminary studies indicate the bioreactor both stimulates and monitors development of cells, matrix content and scaffolds to make three-dimensional engineered cartilage. The cell-laden gels were stimulated twice daily for an hour. Sulfated glycosaminoglycan (sGAG)—which combines with fibrous proteins to form the extracellular matrix—increased significantly after seven days, a structural change detected as a corresponding significant decrease in the ultrasound signals.

The research described in the new paper was performed at and led by NIST. The bioreactor is a collaborative project with several co-authors from the University of Colorado Boulder (CU) Department of Chemical and Biological Engineering.

NIST and CU researchers continue to develop ultrasonic measurement methods and plan to conduct longer experiments. The bioreactor is also being used by other academic researchers as a tool for validating mathematical models of biokinetics, the study of growth and movement in developing tissue.

* J.R. Popp, J.J. Roberts, D.V. Gallagher, K.S. Anseth, S.J. Bryant and T.P. Quinn. An instrumented bioreactor for mechanical stimulation and real-time, nondestructive evaluation of engineered cartilage tissue. Journal of Medical Devices, June 2012, Vol. 6, issue 2, 021006, posted online April 26.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


Light Touch Keeps a Grip on Delicate Nanoparticles

Using a refined technique for trapping and manipulating nanoparticles, researchers at the National Institute of Standards and Technology (NIST) have extended the trapped particles' useful life more than tenfold.* This new approach, which one researcher likens to "attracting moths," promises to give experimenters the trapping time they need to build nanoscale structures and may open the way to working with nanoparticles inside biological cells without damaging the cells with intense laser light.

optical tweezers
NIST researchers’ new approach to trapping nanoparticles uses a control and feedback system that nudges them only when needed, lowering the average intensity of the beam and increasing the lifetime of the nanoparticles while reducing their tendency to wander. On the left, 100-nanometer gold nanoparticles quickly escape from a static trap while gold nanoparticles trapped using the NIST method remained strongly confined.
Credit: NIST

Scientists routinely trap and move nanoparticles in a solution with "optical tweezers"—a laser focused to a very small point. The tiny dot of laser light creates a strong electric field, or potential well, that attracts particles to the center of the beam. Although the particles are attracted into the field, the molecules of the fluid they are suspended in tend to push them out of the well. This effect only gets worse as particle size decreases because the laser's influence over a particle's movement gets weaker as the particle gets smaller. One can always turn up the power of the laser to generate a stronger electric field, but doing that can fry the nanoparticles too quickly to do anything meaningful with them—if it can hold them at all.

NIST researchers' new approach uses a control and feedback system that nudges the nanoparticle only when needed, lowering the average intensity of the beam and increasing the lifetime of the nanoparticle while reducing its tendency to wander. According to Thomas LeBrun, they do this by turning off the laser when the nanoparticle reaches the center and by constantly tracking the particle and moving the tweezers as the particle moves.

"You can think of it like attracting moths in the dark with a flashlight," says LeBrun. "A moth is naturally attracted to the flashlight beam and will follow it even as the moth flutters around apparently at random. We follow the fluttering particle with our flashlight beam as the particle is pushed around by the neighboring molecules in the fluid. We make the light brighter when it gets too far off course, and we turn the light off when it is where we want it to be. This lets us maximize the time that the nanoparticle is under our control while minimizing the time that the beam is on, increasing the particle's lifetime in the trap."

Using this method at constant average beam power, 100-nanometer gold particles remained trapped 26 times longer than had been seen in previous experiments. Silica particles 350 nanometers in diameter lasted 22 times longer, but with the average beam power reduced by 33 percent. LeBrun says that their approach should be able to be combined with other techniques to trap and hold even smaller nanoparticles for extended periods without damaging them.

"We're more than an order of magnitude ahead of where we were before," says LeBrun. "We now hope to begin building complex nanoscale devices and testing nanoparticles as sensors and drugs in living cells."

* A. Balijepalli, J. Gorman, S. Gupta and T. LeBrun. Significantly Improved Trapping Lifetime of Nanoparticles in an Optical Trap using Feedback Control. Nano Letters. April 10, 2012. Available online at http://pubs.acs.org/doi/abs/10.1021/nl300301x.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


NIST Mini-sensor Measures Magnetic Activity in Human Brain

A miniature atom-based magnetic sensor developed by the National Institute of Standards and Technology (NIST) has passed an important research milestone by successfully measuring human brain activity. Experiments reported this week* verify the sensor's potential for biomedical applications such as studying mental processes and advancing the understanding of neurological diseases.

brain sensor
NIST's atom-based magnetic sensor, about the size of a sugar cube, can measure human brain activity. Inside the sensor head is a container of 100 billion rubidium atoms (not seen), packaged with micro-optics (a prism and a lens are visible in the center cutout). The light from a low-power infrared laser interacts with the atoms and is transmitted through the grey fiber-optic cable to register the magnetic field strength. The black and white wires are electrical connections.
Credit: Knappe/NIST

NIST and German scientists used the NIST sensor to measure alpha waves in the brain associated with a person opening and closing their eyes as well as signals resulting from stimulation of the hand. The measurements were verified by comparing them with signals recorded by a SQUID (superconducting quantum interference device). SQUIDs are the world's most sensitive commercially available magnetometers and are considered the "gold standard" for such experiments. The NIST mini-sensor is slightly less sensitive now but has the potential for comparable performance while offering potential advantages in size, portability and cost.

The study results indicate the NIST mini-sensor may be useful in magnetoencephalography (MEG), a noninvasive procedure that measures the magnetic fields produced by electrical activity in the brain. MEG is used for basic research on perceptual and cognitive processes in healthy subjects as well as screening of visual perception in newborns and mapping brain activity prior to surgery to remove tumors or treat epilepsy. MEG also might be useful in brain-computer interfaces.

MEG currently relies on SQUID arrays mounted in heavy helmet-shaped flasks containing cryogenic coolants because SQUIDs work best at 4 degrees above absolute zero, or minus 269 degrees Celsius. The chip-scale NIST sensor is about the size of a sugar cube and operates at room temperature, so it might enable lightweight and flexible MEG helmets. It also would be less expensive to mass produce than typical atomic magnetometers, which are larger and more difficult to fabricate and assemble.

"We're focusing on making the sensors small, getting them close to the signal source, and making them manufacturable and ultimately low in cost," says NIST co-author Svenja Knappe. "By making an inexpensive system you could have one in every hospital to test for traumatic brain injuries and one for every football team."

The mini-sensor consists of a container of about 100 billion rubidium atoms in a gas, a low-power infrared laser and fiber optics for detecting the light signals that register magnetic field strength—the atoms absorb more light as the magnetic field increases. The sensor has been improved since it was used to measure human heart activity in 2010.** NIST scientists redesigned the heaters that vaporize the atoms and switched to a different type of optical fiber to enhance signal clarity.
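The readout principle described above can be caricatured in a few lines (an illustrative toy model only: the linear absorption-to-field relation and the calibration constant `k` are invented for illustration and are not the sensor's actual response):

```python
def field_from_transmission(p_transmitted, p_incident, k=1.0e-9):
    """Toy readout: the article notes the atoms absorb more light as the
    magnetic field increases, so the absorbed fraction of the laser light
    maps to a field estimate. k (tesla per unit absorbed fraction) is a
    made-up calibration constant, not a measured one."""
    absorbed_fraction = 1.0 - p_transmitted / p_incident
    return k * absorbed_fraction

# Example: a 0.1% drop in transmitted light corresponds, with this
# made-up calibration, to a field of roughly 1 picotesla.
print(field_from_transmission(0.999, 1.0))
```

In the real device the detected quantity is the infrared light carried back through the fiber-optic cable; the sketch above only conveys that more absorption registers as a stronger field.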

The brain experiments were carried out in a magnetically shielded facility at the Physikalisch-Technische Bundesanstalt (PTB) in Berlin, Germany, which has an ongoing program in biomagnetic imaging using human subjects. The NIST sensor measured magnetic signals of about 1 picotesla (trillionths of a tesla). For comparison, the Earth's magnetic field is 50 million times stronger (at 50 millionths of a tesla). NIST scientists expect to boost the mini-sensor's performance about tenfold by increasing the amount of light detected. Calculations suggest an enhanced sensor could match the sensitivity of SQUIDs. NIST scientists are also working on a preliminary multi-sensor magnetic imaging system as a prelude to testing clinically relevant applications.
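The quoted field-strength comparison is a simple ratio, using only the figures given in the text:

```python
# Ratio of Earth's magnetic field to the brain signals measured by the
# NIST sensor, using the figures quoted in the article.
earth_field_tesla = 50e-6    # ~50 millionths of a tesla (50 microtesla)
brain_signal_tesla = 1e-12   # ~1 picotesla (trillionth of a tesla)

ratio = earth_field_tesla / brain_signal_tesla
print(f"Earth's field is about {ratio:,.0f} times stronger")
# prints "Earth's field is about 50,000,000 times stronger"
```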

* T.H. Sander, J. Preusser, R. Mhaskar, J. Kitching, L. Trahms and S. Knappe.  Magnetoencephalography with a chip-scale atomic magnetometer. Biomedical Optics Express. Vol. 3, Issue 5, pp. 981–990. Published online April 17.
** See the 2010 NIST Tech Beat article, "NIST Mini-Sensor Traces Faint Magnetic Signature of Human Heartbeat," at www.nist.gov/pml/div688/magnetic_101310.cfm.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


Cloud Computing Forum & Workshop V Meets June 5-7 at the Department of Commerce

The National Institute of Standards and Technology (NIST) is hosting Cloud Computing Forum & Workshop V on June 5-7, 2012, at the Department of Commerce’s Herbert C. Hoover Building in Washington, D.C.

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service-provider interaction. Government, industry, academia and individuals are using cloud services.

NIST’s role in cloud computing is to help accelerate the secure and effective adoption of cloud computing in the federal government. The agency leads efforts to develop standards and guidelines and advance cloud computing technology in collaboration with standards bodies, businesses and other private-sector organizations, government agencies and other stakeholders.

Two sessions of this meeting focus on reviewing progress on the Priority Action Plans (PAPs) for each of the 10 high-priority requirements related to interoperability, portability and security that were identified by U.S. government agencies for adopting cloud computing. These requirements were first announced at the Cloud Computing Forum & Workshop IV in November 2011, and were published in the first release of the USG Cloud Computing Technology Roadmap, Volume I (NIST SP 500-293).* The two sessions will showcase the voluntary, independent, cloud-related PAP efforts underway by industry, academia and standards-developing organizations on these 10 high-priority requirements.

Government executives from Europe, Asia and North America, including U.S. Chief Information Officer Steve VanRoekel, will discuss views on the potential for cloud computing technology to transform public services. In another session, representatives from major standards-developing organizations will tackle the question, “Are there too many cooks in the kitchen?”

Other panels will cover “Industry Views on the Cloud Computing Model Development, Adoption and Next Steps” and “Target Business Use Cases for Government Cloud Computing Development.” The discussion of “USG Federal CC Effort: How Do They Fit Together” features representatives from the Department of Energy, General Services Administration and the Office of Management and Budget.

The workshop runs through the morning of June 7 and offers sessions on cloud security, cloud reference architecture and taxonomy, and updates in NIST cloud-related programs.

For more information on the Cloud Computing Forum & Workshop V, including the agenda and registration, visit www.nist.gov/itl/cloud/cloudworkshopv.cfm.

* Available at www.nist.gov/itl/cloud/upload/SP_500_293_volumeI-2.pdf.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


Next Generation Rail Forums Head to Kansas City and Orlando

The National Institute of Standards and Technology (NIST) and the U.S. Department of Transportation (DOT) will host two forums designed to connect manufacturers with more than 34,000 domestic suppliers for an upcoming passenger rail car procurement. The forums, to be held May 3, 2012, in Kansas City, Mo., and May 8 in Orlando, Fla., will help identify a broader base of potential domestic suppliers for the procurement, with the goal of stimulating domestic suppliers’ interest in competing to achieve 100 percent domestic content in the rail cars.

The one-day forums will include representatives from federal and state agencies, local economic development agencies, rail car builders and associated original equipment manufacturers (OEMs), and U.S. manufacturers who potentially could be suppliers.

For the Kansas City event, NIST’s Hollings Manufacturing Extension Partnership (MEP) and DOT’s Federal Railroad Administration are partnering with the Kansas and Missouri MEP affiliates Mid-America Manufacturing Technology Center (MAMTC) and Missouri Enterprise. They will be joined by several additional partnering organizations to sponsor the event using a grant received through a multi-agency Jobs and Innovation Accelerator Grant program.

NIST’s MEP and DOT’s Federal Railroad Administration are working with the Florida Manufacturing Extension Partnership and the other MEP Centers located in every U.S. state and Puerto Rico to host the Orlando Next Generation Rail forum at Manufacturing Innovation 2012, the MEP national conference.

In February 2012, the first two forums were held in Sacramento and Chicago, with total attendance of more than 400 representatives of U.S. manufacturing. The prospective suppliers came from 24 U.S. states to connect with 12 OEMs and tier-one suppliers (direct suppliers to OEMs) in more than 300 one-on-one meetings.

“The forums in Sacramento and Chicago provided U.S. manufacturers with information on upcoming procurement opportunities that may provide a path for growing their business through expansion into an emerging market,” said Aimee Dobrzeniecki, deputy director of NIST MEP.

The forums are the outgrowth of a partnership* formed in October 2011 between the U.S. Department of Commerce and DOT to leverage agency capabilities to ensure development of a domestic supply base to support transportation in the United States.

Visit www.nist.gov/mep/rail.cfm to learn more about the Next Generation Rail Supply Chain Connectivity initiative, to register for upcoming forums, and to access summaries from past forums and the latest information about the rail industry supplier opportunities.

* Read the blog entry, “Commerce and Transportation Departments Forge Partnership to Boost Domestic Manufacturing Across America,” at www.commerce.gov/blog/2011/10/18/commerce-and-transportation-departments-forge-partnership-boost-domestic-manufacturi.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


NIST Fire Research Earns Honors

Fire researchers at the National Institute of Standards and Technology (NIST) recently were honored by the International Association of Arson Investigators (IAAI), while the Fire Department of New York has just announced that it will name NIST fire protection engineer Daniel Madrzykowski an honorary fire battalion chief.

At its annual meeting on April 25 in Dover, Del., IAAI recognized NIST for helping to organize and provide instructional support for training and certification programs for federal, state and local fire investigators. These programs often include exercises carried out at supervised burns, which use abandoned structures instrumented to accommodate controlled fire experiments. IAAI used video taken at fire experiments in Bensenville, Ill., townhouses as the basis for a training video, “The First Responder’s Role in Fire Investigation.”

The IAAI Outstanding Accomplishment Award also recognizes the NIST Fire Research Division for using the findings from its technical investigation of the 2007 Charleston Sofa Super Store fire, which resulted in the deaths of nine firefighters, to help prepare an online training video that conveys lessons learned from that study (accessible at www.CFITrainer.net). IAAI, which has a membership of more than 5,000 fire investigation professionals from around the world, singled out the NIST Engineering Laboratory’s Fire Dynamics Simulator for its usefulness in reconstructing fire behavior.

A registered engineer, Madrzykowski conducts research in the areas of fire suppression, large-fire measurements, fire investigation and firefighter safety. He is also a member of the National Fire Protection Association and the International Association of Arson Investigators and is a Society of Fire Protection Engineers fellow.

Madrzykowski and NIST colleagues collaborated with the Fire Department of New York City (FDNY) and the Polytechnic Institute of New York University on research that determined how wind affects fires in high-rise buildings. That study included field experiments in an abandoned seven-story building on Governors Island, N.Y. The results confirmed that conditions created by wind can push hot gases and smoke from the apartment of origin into the public corridors and stairwells.

A set of instructional videos based on the research is available for firefighter training to improve safety for civilians and firefighters. Since the study was completed, Madrzykowski has continued to work with members of the FDNY as they modify their firefighting tactics to incorporate the latest research findings.

New York Fire Commissioner Salvatore J. Cassano will appoint Madrzykowski to the rank of honorary battalion chief at a ceremony on May 31.

Media Contact: Mark Bello, mark.bello@nist.gov, 301-975-3776
