Tech Beat - June 9, 2010

Editor: Michael Baum
Date created: July 6, 2010
Date modified: July 6, 2010
Contact: inquiries@nist.gov

NIST/JILA 'Dark Pulse Laser' Produces Bursts of … Almost Nothing

In an advance that sounds almost Zen, researchers at the National Institute of Standards and Technology (NIST) and JILA, a joint institute of NIST and the University of Colorado at Boulder, have demonstrated a new type of pulsed laser that excels at not producing light. The new device generates sustained streams of “dark pulses”—repeated dips in light intensity—the opposite of the bright bursts in a typical pulsed laser.

Colorized trace of pulses from the NIST/JILA “dark pulse” laser, indicating the light output nearly shuts down about every 2.5 nanoseconds.

Credit: NIST

Despite its ominous name, the dark pulse laser is envisioned as a tool for benign communications and measurements based on infrared light frequencies. The laser’s ultrashort pulses span just 90 picoseconds (trillionths of a second), making the device suitable for measurements on short timescales. Dark pulses might be useful in signal processing because, unlike bright pulses, they generally propagate without distortion, and they might serve as a camera shutter for a continuous light beam in optical networks.

Described in Optics Express,* the new NIST/JILA technology is the first to generate dark pulses directly from a semiconductor laser cavity, without electrical or optical shaping of pulses after they are produced. The chip-sized infrared laser generates light from millions of quantum dots (qdots), nanostructured semiconductor materials grown at NIST. Quantum dot lasers are known for unusual behavior.

In the new device, small electrical currents injected into the laser cause the qdots to emit light. The qdots are all about the same size—about 10 nanometers (billionths of a meter) wide—and, thanks to a nanostructured design that makes them behave like individual atoms, they all emit light at the same frequency. The current supplies enough energy to amplify the collective emission from the dots, creating the special properties of laser light.

The new laser depends on the qdots’ unusual energy dynamics, which have the effect of stabilizing dark pulses. After emitting light, qdots recover energy from within rapidly (in about 1 picosecond) but more slowly (in about 200 picoseconds) from energy inputs originating outside the qdots in the laser cavity. This creates a progression of overall energy gains gradually giving way to overall energy losses. Eventually, the laser reaches a steady state of repeated brief intensity dips—a drop of about 70 percent—from the continuous light background.
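
The figures quoted above are enough to sketch what such a pulse train looks like. The short Python sketch below builds an illustrative intensity trace from the 2.5 nanosecond repetition period, 90 picosecond pulse width and 70 percent dip; the Gaussian dip shape is an assumption made here for illustration, not a detail from the paper.

```python
import numpy as np

# Illustrative dark-pulse train: a continuous background with brief
# intensity dips, using the figures quoted in the article. The Gaussian
# dip shape is assumed; the actual pulse profile is not specified here.
REP_PERIOD = 2.5e-9    # dips recur about every 2.5 nanoseconds
PULSE_WIDTH = 90e-12   # each dip lasts roughly 90 picoseconds (FWHM)
DIP_DEPTH = 0.70       # intensity drops about 70 percent from background

t = np.linspace(0, 10e-9, 20_000)    # 10-nanosecond observation window
sigma = PULSE_WIDTH / 2.355          # convert FWHM to a Gaussian sigma

# Place one Gaussian dip at every repetition period.
dip_centers = np.arange(REP_PERIOD / 2, t[-1], REP_PERIOD)
dips = sum(np.exp(-0.5 * ((t - tc) / sigma) ** 2) for tc in dip_centers)

intensity = 1.0 - DIP_DEPTH * np.clip(dips, 0.0, 1.0)
print(f"minimum intensity: {intensity.min():.2f} of the background")
```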

The dark pulse laser was developed through close collaborations between NIST experts in qdot growth and semiconductor laser design and fabrication, and JILA experts in ultrafast lasers and related measurements. NIST has ongoing research efforts to develop quantum dot lasers and to develop modeling, fabrication, and measurement methods for semiconductor nanostructures such as quantum dots. In general, semiconductor lasers are being considered for many advanced applications, such as next-generation atomic clocks based on optical frequencies, for which large lasers are costly and complex.

* M. Feng, K.L. Silverman, R.P. Mirin and S.T. Cundiff. Dark pulse laser. Optics Express, published online June 7, 2010.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880

back to top

'Instant Acid' Method Offers New Insight into Nanoparticle Dispersal in the Environment and the Body

Using a chemical trick that allows them to change the acidity of a solution almost instantly, a team at the National Institute of Standards and Technology (NIST) has demonstrated a simple and effective technique for quantifying how the stability of nanoparticle solutions changes when the acidity of their environment suddenly shifts*. The measurement method and the problem studied are part of a broader effort at NIST to understand the environmental, health and safety implications of nanoparticles.

Successive test runs at NIST show how the clumping of typical nanoparticles in solution depends on changes in acidity. Time after the acidity jump is shown on the horizontal axis; the vertical axis is a measure of the size of the nanoparticle aggregates. As pH goes down (and acidity up), both the rate of aggregation and the size of the clumps go up.

Credit: R. Murphy/NIST

Any change in nanoparticle solubility with local acidity (pH**) ultimately affects how the particles are distributed in the environment as well as their potential for uptake into organisms. This is crucial when designing nanoparticles for use in medicine, explains NIST chemical engineer Vivek Prabhu. “Cells in the body are very compartmentalized. There are places within the cell that have vastly different pH. For instance, in the sea of the cell, the cytosol, pH is regulated to be about 7.2, which is slightly basic. But within the lysosome, which is where things go to get broken down, the pH is about 4.5, so it’s very acidic.”

Nanoparticles designed for use in drug therapy or as contrast agents for medical imaging typically are coated with molecules to prevent the particles from clumping together, which would reduce their effectiveness. But the efficacy of the anti-clumping coating often depends on the pH of the environment. According to the NIST team, while it’s relatively easy to put nanoparticles in a solution at a particular pH and to study the stability of the suspension over long times, it is difficult to tell what happens when the particles are suddenly exposed to a different level of acidity, as often occurs in environmental and application contexts. How long do the particles take to react to such a change, and how do they respond?

“Our idea borrows some of the materials used in photolithography to make microcircuits,” says Prabhu. “There are molecules that become acids when you shine a light on them—photo acid generators. So instead of manually pouring acid into a solution and stirring it around, you start with a solution in which these molecules already are mixed and dissolved. Once you shine light on it …bam! Photolysis occurs and it becomes acidic.” The acidity of the solution can be made to jump a major step—an amount chosen by the experimenter—without needing to wait for mixing or disturbing the solution. “It gives you a way to probe the nanoparticle solution dynamics at much shorter timescales than before,” says Prabhu.
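
A back-of-the-envelope calculation shows why photolysis amounts to an instant pH jump. The sketch below assumes the photoacid generator releases a strong acid that dissociates completely; the concentration is a hypothetical value, not one from the paper.

```python
import math

def ph_of_strong_acid(acid_molar: float) -> float:
    """pH of a dilute strong acid, assuming complete dissociation and
    neglecting water's own tiny contribution of hydrogen ions."""
    return -math.log10(acid_molar)

# Hypothetical example: a flash of light converts enough photoacid
# generator to release 1e-4 molar strong acid into a neutral solution.
released = 1e-4
print("pH before flash: ~7.0 (neutral)")
print(f"pH after flash:  {ph_of_strong_acid(released):.1f}")  # jumps to 4.0
```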

Using their “instant acid” technique and light scattering instruments to monitor the aggregation of nanoparticles, the NIST team followed the growth of clusters of chemically stabilized latex nanoparticles for the first few seconds after inducing the pH transition with light. Their results demonstrate that under certain conditions, the stability of the nanoparticles—their tendency to resist clumping—becomes very sensitive to pH. Studies such as these could provide a stronger foundation to design nanoparticles for applications such as targeting tumor cells that have levels of acidity markedly different from normal cells.
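
The kind of cluster growth the team monitored can be pictured with a classic Smoluchowski coagulation estimate, in which each collision between clusters sticks with a pH-dependent probability. This is a generic textbook model, not the paper's analysis, and every parameter value below is an assumption.

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_cluster_size(t, n0, temp_k=298.0, viscosity=1e-3, stick_prob=1.0):
    """Mean number of particles per cluster after time t (seconds), from
    Smoluchowski's coagulation result <m>(t) = 1 + t/t_c, with the
    characteristic time t_c = 3*eta / (4*kB*T*n0*P_stick). Here n0 is the
    initial particle number density per cubic meter and stick_prob is the
    probability that a collision leads to aggregation."""
    t_c = 3 * viscosity / (4 * K_B * temp_k * n0 * stick_prob)
    return 1 + t / t_c

n0 = 1e18  # assumed initial concentration: 10^18 particles per cubic meter
for p_stick in (0.01, 0.1, 1.0):  # lower pH means stickier collisions
    m = mean_cluster_size(t=1.0, n0=n0, stick_prob=p_stick)
    print(f"sticking probability {p_stick:>4}: size ~ {m:.1f} after 1 s")
```

In this simple picture, raising the sticking probability, as lowering the pH does for these particles, speeds aggregation and grows larger clumps, the same trend the test runs above display.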

The work was supported in part by the National Research Council–NIST Postdoctoral Fellowship Program.

* R.J. Murphy, D. Pristinski, K. Migler, J.F. Douglas and V.M. Prabhu. Dynamic light scattering investigations of nanoparticle aggregation following a light-induced pH jump. Journal of Chemical Physics. 132, 194903 (2010) doi:10.1063/1.3425883.

** pH is the common measure used by chemists of how acidic or basic a solution is. The scale runs from 0 to 14; lower values are more acidic, higher values more basic; 7 is considered neutral.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763

back to top

Liposome-Hydrogel Hybrids: No Toil, No Trouble for Stronger Bubbles

People have been combining materials to bring forth the best properties of both ever since copper and tin were merged to start the Bronze Age. In the latest successful merger, researchers at the National Institute of Standards and Technology (NIST), the University of Maryland (UM) and the U.S. Food and Drug Administration (FDA) have developed a method to combine two substances that individually have generated interest for their potential biomedical applications: a phospholipid membrane “bubble” called a liposome and particles of hydrogel, a water-filled network of polymer chains. The combination forms a hybrid nanoscale (billionth of a meter) particle that may one day travel directly to specific targets such as tumor cells, pass easily through the target’s cell membrane, and then slowly release a drug payload.

Schematic depicting the creation of liposome-hydrogel hybrids. A solution containing phospholipid (“liposome precursor”) mixes with a solution containing hydrogel precursor (a). Blending together at the interface of the two channels, the phospholipid forms liposomes (b) that trap the hydrogel precursor inside. Material outside the vesicles is removed (c) and the liposomes are UV irradiated. This polymerizes the hydrogel precursor into cross-linked polymer chains and yields a liposome-hydrogel hybrid (d).

Credit: NIST

In a recent paper in the journal Langmuir*, the research team reviewed how liposomes and hydrogel nanoparticles have individual advantages and disadvantages for drug delivery. While liposomes have useful surface properties that allow them to target specific cells and pass through membranes, they can rupture if the surrounding environment changes. Hydrogel nanoparticles are more stable and possess controlled release capabilities to tune the dosage of a drug over time, but are prone to degradation and clumping. The researchers’ goal was to engineer nanoparticles incorporating both components to utilize the strengths of each material while compensating for their weaknesses.

To manufacture their liposome-hydrogel hybrid vesicles, the researchers adapted a NIST-UM technique known as COMMAND for COntrolled Microfluidic Mixing And Nanoparticle Determination that uses a microscopic fluidic (microfluidic) device (see “NIST, Maryland Researchers COMMAND a Better Class of Liposomes” in NIST Tech Beat, April 27, 2010). In the new work, phospholipid molecules are dissolved in isopropyl alcohol and fed via a tiny (21 micrometers in diameter, or three times the size of a yeast cell) inlet channel into a “mixer” channel, then “focused” into a fluid jet by a water-based solution added through two side channels. Hydrogel precursor molecules are mixed in with the focusing fluid.
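
At these dimensions the flow is strongly laminar, which is what allows the alcohol stream to be hydrodynamically “focused” into a smooth jet rather than turbulently mixed. A quick sanity check in Python (the channel diameter is from the article; the flow speed and water-like fluid properties are assumed values):

```python
# Laminar-flow sanity check for the microfluidic mixer described above.
DENSITY = 1000.0     # kg/m^3, water-like focusing solution (assumed)
VISCOSITY = 1.0e-3   # Pa*s, water at room temperature (assumed)
CHANNEL_D = 21e-6    # m, inlet channel diameter (from the article)
FLOW_SPEED = 0.1     # m/s, an assumed typical microfluidic velocity

reynolds = DENSITY * FLOW_SPEED * CHANNEL_D / VISCOSITY
print(f"Reynolds number ~ {reynolds:.1f}")
# ~2, far below the ~2000 threshold for turbulence, so the streams flow
# side by side and blend only by diffusion at their interfaces.
```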

As the components blend together at the interfaces of the fluid streams, the phospholipid molecules self-assemble into nanoscale vesicles of controlled size and trap the monomers in solution inside. The newly formed vesicles then are irradiated with ultraviolet light to polymerize the hydrogel precursors they carry into a solid gel made up of cross-linked chains. These chains give strength to the vesicles while permitting them to retain the spherical shape of the liposome envelope (which, in turn, would facilitate passage through a cell membrane).

To turn the liposome-hydrogel hybrid vesicles into cellular delivery vehicles, a drug or other cargo would be added to the focusing fluid during production.

* J.S. Hong, S.M. Stavis, S.H. DePaoli Lacerda, L.E. Locascio, S.R. Raghavan and M. Gaitan. Microfluidic directed self-assembly of liposome-hydrogel hybrid nanoparticles. Langmuir, published online April 29, 2010.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025

back to top

Walls Falling Faster for Solid-State Memory

After running a series of complex computer simulations, researchers have found that flaws in the structure of magnetic nanoscale wires play an important role in determining the operating speed of novel devices using such nanowires to store and process information. The finding*, made by researchers from the National Institute of Standards and Technology (NIST), the University of Maryland, and the University of Paris XI, will help to deepen the physical understanding of these next-generation devices and guide the interpretation of future experiments on them.

Magnetic nanowires store information in discrete bands of magnetic spins. One can imagine the nanowire as a straw sucking up and holding the liquid of a meticulously layered chocolate-and-vanilla milkshake, with the chocolate segments representing 1s and the vanilla segments 0s. The boundaries between these layers are called domain walls. Researchers manipulate the information stored on the nanowire using an electrical current to push the domain walls, and the information they enclose, through the wire and past immobile read and write heads.
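
In software terms, the stored pattern is simply a sequence of magnetization bands with a domain wall at every transition, and pushing the pattern past a fixed read head is what the current does physically. A toy illustration (a hypothetical encoding, not a physical simulation):

```python
# Toy picture of domain-wall storage: bits live in magnetization bands,
# domain walls sit at transitions between bands, and a current pulse
# shifts the whole pattern past a fixed read head. Purely illustrative.
bits = [1, 1, 0, 0, 0, 1, 0, 1]   # "chocolate" = 1, "vanilla" = 0

def domain_wall_positions(pattern):
    """Indices where adjacent bands differ, i.e. where the walls sit."""
    return [i for i in range(len(pattern) - 1) if pattern[i] != pattern[i + 1]]

def shift_past_head(pattern):
    """One current pulse: the band at the head (index 0) is read out,
    and the whole pattern advances one position along the wire."""
    return pattern[0], pattern[1:]

print("walls at:", domain_wall_positions(bits))
head_value, bits = shift_past_head(bits)
print("read head saw:", head_value)
```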

Interpretations of experiments seeking to measure how domain walls move have largely ignored the effects of “disorder”—usually the result of defects or impurities in the structure of the nanowires. To see how disorder affects the motion of these microscopic magnetic domains, NIST researchers and their colleagues introduced disorder into their computer simulations.

Their simulations showed that disorder, which causes friction within the nanowires, can increase the rate at which a current moves domain walls.

According to NIST physicist Mark Stiles, friction can cause the domain walls to move faster because they need to lose energy in order to move down the wire.

For example, when a gyroscope spins, it resists the force of gravity. If a little friction is introduced into the gyroscope’s bearing, the gyroscope will fall over more quickly. Similarly, in the absence of damping, a domain wall will only move from one side of the nanowire to the other. Disorder within the nanowire enables the domain walls to lose energy, which gives them the freedom to “fall” down the length of the wire as they move back and forth.
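
The gyroscope analogy maps directly onto the standard Landau-Lifshitz-Gilbert (LLG) equation for magnetization dynamics, in which the damping constant alpha plays the role of friction in the bearing. The macrospin sketch below is a generic textbook model, not the paper's micromagnetic simulation, and the field and damping values are arbitrary; it simply shows that more damping lets the moment "fall" into alignment faster.

```python
import numpy as np

GAMMA = 1.76e11  # electron gyromagnetic ratio, rad/(s*T)

def time_to_align(alpha, b_tesla=0.1, dt=5e-13, max_steps=200_000):
    """Integrate the Landau-Lifshitz form of the LLG equation,
        dm/dt = -g m x B - g*alpha m x (m x B),  g = GAMMA/(1 + alpha^2),
    for a single moment m, and return how long it takes m to come
    within 1 degree of the applied field B (along +z)."""
    m = np.array([1.0, 0.0, 0.01])
    m /= np.linalg.norm(m)
    b = np.array([0.0, 0.0, b_tesla])
    g = GAMMA / (1 + alpha**2)
    target = np.cos(np.radians(1.0))
    for step in range(1, max_steps + 1):
        mxb = np.cross(m, b)
        m += (-g * mxb - g * alpha * np.cross(m, mxb)) * dt
        m /= np.linalg.norm(m)  # keep the moment unit length
        if m[2] > target:
            return step * dt
    return float("inf")

for alpha in (0.01, 0.1):  # more "friction" -> faster alignment
    print(f"alpha = {alpha}: aligned in {time_to_align(alpha) * 1e9:.1f} ns")
```

Without damping the moment would precess around the field forever; the energy loss is what lets it settle, just as friction lets the gyroscope fall.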

“We can say that the domain wall is moving as if it were in a system that has considerably greater effective damping than the actual damping,” says NIST physicist and lead researcher Hongki Min. “This increase in the effective damping is significant enough that it should affect the interpretation of most future domain wall experiments.”

* H. Min, R.D. McMichael, M.J. Donahue, J. Miltat and M.D. Stiles. Effects of disorder and internal dynamics on vortex wall propagation. Phys. Rev. Lett. 104, 217201. May 26, 2010. http://prl.aps.org/abstract/PRL/v104/i21/e217201.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735

back to top

NIST Helps Accelerate the Federal Government's Move to the Cloud

CIO On the Cloud—At a May 20, 2010, NIST-organized forum, Federal CIO Vivek Kundra calls for industry and government to work together to accelerate government cloud computing adoption by focusing on standards development and leveraging the cloud certification work performed by other agencies.

The National Institute of Standards and Technology (NIST) has been designated by Federal Chief Information Officer Vivek Kundra to accelerate the federal government’s secure adoption of cloud computing by leading efforts to develop standards and guidelines in close consultation and collaboration with standards bodies, the private sector, and other stakeholders. Computer science researchers at NIST are working on two complementary efforts to speed that adoption.

Cloud computing is an emerging model for obtaining on-demand access to shared computing resources, often through remotely located, widely distributed data networks. Kundra sees this new vehicle for shared computing services as a means to lower the cost of government operations, drive innovation and fundamentally change the way government delivers technology services across the board.

NIST has been involved in cloud computing since the field’s inception and has developed a widely accepted definition of it. The agency is currently focused on two major cloud computing efforts.

The first is a collaborative technical initiative known as Standards Acceleration to Jumpstart Adoption of Cloud Computing (SAJACC), intended to validate and communicate interim cloud computing specifications before they become formal standards.

The major cloud computing requirements that will be addressed by these interface specifications are security, portability (the ability to move data) and interoperability (the ability of different systems to work together seamlessly).

NIST researchers are working with other agencies and standards development organizations to identify existing specifications and requirements for use cases—ways users interact with cloud systems, such as sending data to a cloud service provider’s environment and later retrieving it and removing it from that provider. The NIST approach will help to identify gaps in cloud computing standards and focus effort on filling them. SAJACC researchers plan to create a portal to collect and share the use cases, specifications and test results.
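
Such a use case, sending data to a provider, retrieving it and then removing it, can be written as a short, provider-neutral test script. The sketch below is purely illustrative: the endpoint URL, token and object API are hypothetical stand-ins invented here, not part of any SAJACC or NIST specification.

```python
import requests

BASE = "https://storage.example.com/v1"     # hypothetical provider endpoint
AUTH = {"Authorization": "Bearer <token>"}  # hypothetical credential

def portability_use_case(key: str, payload: bytes) -> bool:
    """Exercise a send / retrieve / remove cycle against a cloud provider."""
    # 1. Send data into the provider's environment.
    requests.put(f"{BASE}/objects/{key}", data=payload,
                 headers=AUTH).raise_for_status()
    # 2. Retrieve it and verify it round-trips unchanged.
    got = requests.get(f"{BASE}/objects/{key}", headers=AUTH)
    got.raise_for_status()
    assert got.content == payload, "retrieved data differs from what was sent"
    # 3. Remove it and confirm the provider no longer serves it.
    requests.delete(f"{BASE}/objects/{key}", headers=AUTH).raise_for_status()
    return requests.get(f"{BASE}/objects/{key}", headers=AUTH).status_code == 404

if __name__ == "__main__":
    print("use case passed:", portability_use_case("sajacc-demo", b"hello cloud"))
```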

Another major challenge with cloud computing is safeguarding government data in clouds, especially citizens’ private information. Agencies using cloud computing will still follow the NIST-developed Federal Information Security Management Act (FISMA) guidelines.

NIST is serving as the technical advisor for the Federal Risk and Authorization Management Program (FedRAMP), which will allow agencies to collaboratively develop baseline FISMA security criteria and authorization-to-operate deliverables up front for cloud computing vendor products and services. This shared certification, accreditation and authorization process is designed to cut duplication of effort. Once a baseline is approved, each agency could augment it according to its individual data and mission system security authorization needs. More information on FedRAMP is available at http://cio.gov/pages.cfm/page/Federal-Risk-and-Authorization-Management-Program-FedRAMP.

For more on NIST’s cloud computing work, including the NIST definition of cloud computing, visit http://csrc.nist.gov/groups/SNS/cloud-computing.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661

back to top

NIST WTC Recommendations Are Basis for New Set of Revised Codes

Faster and more efficient emergency evacuations from buildings—especially tall structures—and better communications among first responders during an emergency are among the safety improvements expected from 17 major and far-reaching building and fire code changes approved recently by the International Code Council (ICC). The changes are based on recommendations from the National Institute of Standards and Technology (NIST) stemming from its investigation of the collapses of New York City’s World Trade Center (WTC) towers and WTC 7 on Sept. 11, 2001.

The new changes, adopted at the ICC hearings held May 15-23, 2010, in Dallas, Texas, will be incorporated into the 2012 edition of the ICC’s I-Codes (specifically the International Building Code, or IBC, and the International Fire Code, or IFC), state-of-the-art model codes used as the basis for building and fire regulations promulgated and enforced by U.S. state and local jurisdictions. Those jurisdictions have the option of incorporating some or all of the codes’ provisions but generally adopt most of them.

The 17 new code changes include important safety improvements to the existing requirements for elevators in tall buildings, which are used during an emergency by evacuating occupants and entering firefighters, and provisions to ensure that emergency radio communications will effectively serve first responders throughout their local communities.

The newly adopted code changes are the second set adopted in the past two years by the ICC based on recommendations from the NIST WTC investigation. Twenty-three changes were approved in October 2008 and incorporated into the 2009 edition of the I-Codes.

“With their adoption and reaffirmation over two code cycles, we believe that the safety improvements stimulated by the NIST WTC investigation are now well integrated within the mainstream of U.S. building and fire codes,” said WTC Lead Investigator Shyam Sunder.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025

back to top

Eighty-Three Organizations Seek the 2010 Baldrige Award

Eighty-three organizations are in the running for the 2010 Malcolm Baldrige National Quality Award, the nation’s highest recognition for organizational performance excellence through innovation and improvement. Applicants include three manufacturers, two service companies, seven small businesses, 10 educational organizations, 54 health care organizations and seven nonprofit/governmental organizations. The number of applicants is up 20 percent over 2009, and this marks the fifth consecutive year that 70 or more organizations have sought the award. Additionally, the 54 health care applicants are the most in that category since it was established in 1999.

The 2010 applicants will be evaluated rigorously by an independent board of examiners in seven areas: leadership; strategic planning; customer focus; measurement, analysis and knowledge management; workforce focus; process management; and results. Examiners provide each applicant with 300 to 1,000 hours of review and a detailed report on the organization’s strengths and opportunities for improvement.

The 2010 Baldrige Award recipients are expected to be announced in late November 2010.

Named after Malcolm Baldrige, the 26th Secretary of Commerce, the Baldrige Award was established by Congress in 1987. The award—managed by the National Institute of Standards and Technology (NIST) in collaboration with the private sector—promotes excellence in organizational performance, recognizes the achievements and results of U.S. organizations, and publicizes successful performance strategies. The award is not given for specific products or services. Since 1988, 80 organizations have received Baldrige Awards.

Thousands of organizations use the Baldrige Criteria for Performance Excellence to guide their enterprises, improve performance and get sustainable results. This proven improvement and innovation framework offers organizations an integrated approach to key management areas.

“I see the Baldrige process as a powerful set of mechanisms for disciplined people engaged in disciplined thought and taking disciplined action to create great organizations that produce exceptional results,” says Jim Collins, author of Good to Great: Why Some Companies Make the Leap ... and Others Don’t.

To learn more about starting or advancing your organization’s quality journey, go to www.nist.gov/baldrige/publications/criteria.cfm and www.nist.gov/baldrige/enter/self.cfm.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025

back to top