Tech Beat - October 11, 2007

Editor: Michael Baum
Date created: June 5, 2012
Date Modified: June 5, 2012 

New Quantum Dot Transistor Counts Individual Photons

A transistor containing quantum dots that can count individual photons (the smallest particles of light) has been designed and demonstrated at the National Institute of Standards and Technology (NIST). The semiconductor device could be integrated easily into electronics and may be able to operate at higher temperatures than other single-photon detectors—practical advantages for applications such as quantum key distribution (QKD) for “unbreakable” encryption using single photons.

field-effect transistor diagram
NIST's modified field-effect transistor can count single photons, or particles of light. When light enters through the transmission window (see electron micrograph of top of device), it penetrates the gallium arsenide absorbing layer and separates electrons from the ‘holes’ they formerly occupied. Quantum dots (red dots) trap the positively charged holes, while electrons flow into the channel (green Xs). By measuring the channel current, researchers can determine the number of photons absorbed.
Credit: NIST

The NIST device, described in a new paper,* can accurately count 1, 2 or 3 photons at least 83 percent of the time. It is the first transistor-based detector to count numbers of photons; most other types of single-photon detectors simply “click” in response to any small number of photons. (See table for a comparison of various types of single-photon detectors used at NIST.) Counting requires a linear, stepwise response and low-noise operation. This capability is essential for advanced forms of precision optical metrology—a focus at NIST—and could be used both to detect photons and to evaluate single-photon sources for QKD. The new device also has the potential to be cooled electronically, at much higher temperatures than typical cryogenic photon detectors.

Dubbed the QDOGFET, the new detector contains about 1,000 quantum dots, nanoscale clusters of semiconductors with unusual electronic properties. The NIST dots are custom-made to have the lowest energy of any component in the detector, like the bottom of a drain. A voltage applied to the transistor drives a current through an internal channel. Photons enter the device and transfer their energy to electrons in a semiconductor “absorbing layer,” separating the electrons from the “holes” they formerly occupied. As each photon is absorbed, a positively charged hole is trapped by the quantum dot drain, while the corresponding electron is swept into the channel. The amount of current flowing in the channel depends on the number of holes trapped by the quantum dots. By measuring the channel response, scientists can count the detected photons. NIST measurements show that, on average, each trapped hole boosts the channel current by about one-fifth of a nanoampere. The detector has an internal quantum efficiency (the percentage of absorbed photons that result in trapped holes) of 68 ± 18 percent, a record high for this type of photon detector.
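
The readout scheme described above can be sketched in a few lines. This is an illustrative back-of-the-envelope calculation, not NIST's analysis code; the function name `photons_detected` and the 0.2 nA step size (the article's "about one-fifth of a nanoampere" per trapped hole) are assumptions for the example.

```python
# Illustrative sketch of QDOGFET-style photon counting: each trapped hole
# adds roughly 0.2 nA to the channel current, so the photon number can be
# estimated by rounding the current step to a whole number of holes.

CURRENT_STEP_PER_HOLE_NA = 0.2  # approximate figure quoted in the article

def photons_detected(baseline_current_na, measured_current_na):
    """Estimate how many photons were absorbed from the rise in channel
    current, assuming one trapped hole (one current step) per photon."""
    step = measured_current_na - baseline_current_na
    return max(0, round(step / CURRENT_STEP_PER_HOLE_NA))

# A 0.61 nA rise above a 10 nA baseline suggests three absorbed photons.
print(photons_detected(10.0, 10.61))  # -> 3
```

The stepwise, linear response this sketch relies on is exactly what distinguishes the QDOGFET from detectors that merely "click" on any small number of photons.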

The QDOGFET currently detects single photons at wavelengths of about 800 nanometers. By using different semiconductor materials, NIST researchers hope to make detectors that respond to the longer near-infrared wavelengths used in telecommunications. In addition, researchers hope to boost the external quantum efficiency (percentage of photons hitting the detector that are actually detected), now below 10 percent, and operate the device at faster speeds.

The research is supported in part by the Disruptive Technology Office. The authors include one from Los Alamos National Laboratory and one from Heriot-Watt University, Edinburgh, UK.

* E.J. Gansen, M.A. Rowe, M.B. Greene, D. Rosenberg, T.E. Harvey, M.Y. Su, R.H. Hadfield, S.W. Nam and R.P. Mirin. Photon-number-discriminating detection using a quantum dot, optically gated, field-effect transistor. Nature Photonics 1, 585-588 (2007). Published online Oct. 1, 2007.

Media Contact: Laura Ost, 303-497-4880


NIST Light Source Illuminates Fusion Power Diagnostics

Using a device that can turn a tiny piece of laboratory space into an ion cloud as hot as those found in a nuclear fusion reactor, physicists at the National Institute of Standards and Technology (NIST) are helping to develop one of the most exotic “yardsticks” on earth, an instrument to monitor conditions in the plasma of an experimental fusion reactor. Their measurement tool also is used in incandescent light bulbs: it’s the element tungsten.

ITER cutaway
Published with permission of ITER
Engineering design image (l.) shows a cross-section of part of the planned ITER fusion reaction vessel. Divertor sections (pale gray elements at the bottom) are sheathed in tungsten. The larger schematic (r.), which includes a tiny human figure, indicates the scale of the ITER toroid.

The intended beneficiary of this research is ITER, a multinational project to build the world’s most advanced fusion test reactor. ITER, now under construction in Cadarache, France, will operate at high power in near-steady-state conditions, incorporate essential fusion energy technologies and demonstrate safe operation of a fusion power system. It will be a tokamak machine, in which a hot—250 million degrees Celsius—plasma of hydrogen isotope ions, magnetically confined in a huge toroidal shape, will fuse to form helium nuclei and generate considerable amounts of energy, much the same way energy is generated in the sun.

One major issue is how to measure accurately the temperature and density of the plasma, both of which must reach critical values to maintain the fusion process. Any conventional instrument would be incinerated almost instantly. The usual solution would be to use spectroscopy: monitor the amount and wavelengths of light emitted by the process to deduce the state of the plasma. But light comes from electrons as they change their energies, and at tokamak temperatures the hydrogen and helium nuclei are completely ionized—no electrons left. The answer is to look at a heavier element, one not completely ionized at 250 million degrees, and the handy one is tungsten. The metal with the highest melting point, tungsten is used for critical structures in the walls of the tokamak torus, so some tungsten atoms always are present in the plasma.

To gather accurate data on the spectrum of tungsten ionized as highly as it would be in the tokamak, NIST physicists use an electron beam ion trap (EBIT), a laboratory instrument that uses a tightly focused electron beam to create, trap and probe highly charged ions. An ion sample in the EBIT is tiny—a glowing thread about the width of a human hair and two to three centimeters long—but within that tiny volume the EBIT can produce particle collisions at energies similar to those that occur in a fusion plasma or a star. In a pair of papers,* the NIST researchers uncovered previously unrecognized features of the tungsten spectrum, effects seen only at the extreme temperatures that produce highly charged ions. The team reported several previously unknown spectral lines for tungsten atoms with 39 to 47 of their 74 electrons removed. One particularly significant discovery was that an anomalously strong spectral line appearing at roughly the energies of an ITER tokamak is in fact a superposition of two different lines, produced by electron interactions that, under more conventional plasma conditions, are too weak to show up.

Team member John Gillaspy observes, “That’s part of the fascination of these highly charged ions. Things become very strange and bizarre. Things that are normally weak become amplified, and some of the rules of thumb and scaling laws that you learned in graduate school break down when you get into this regime.” The team has proposed a possible new fusion plasma diagnostic based on their measurements of the superimposed lines and supporting theoretical and computational analyses.

More information on the NIST EBIT facility is available on the NIST website.

* Yu. Ralchenko. Density dependence of the forbidden lines in Ni-like tungsten. J. Phys. B: At. Mol. Opt. Phys. 40, F175-F180 (2007).
Yu. Ralchenko, J. Reader, J.M. Pomeroy, J.N. Tan and J.D. Gillaspy. Spectra of W(39+)-W(47+) in the 12-20 nm region observed with an EBIT light source. J. Phys. B: At. Mol. Opt. Phys. 40, 3861-3875 (2007).

Media Contact: Michael Baum, 301-975-2763


How to Protect Your Web Server from Attacks

The National Institute of Standards and Technology (NIST) has released a new publication that provides detailed tips on how to make web servers more resistant to potential attacks. Called “Guidelines on Securing Public Web Servers,” the publication covers some of the latest threats to web security and reflects general changes in web technology since the first version of the guide was published five years ago.

Web servers are the software programs that make information available over the Internet. They are often the most frequently targeted hosts on a computer network. Attackers gaining unauthorized access to the server may be able to change information on the site (e.g., defacing a web page), access sensitive personal information, or install malicious software to launch further attacks. Recently emerging threats include “pharming,” in which people attempting to visit a web site are redirected surreptitiously to a malicious site.

How does one thwart these attacks? The guide advocates basic steps such as keeping up to date on patches (fixes and updates) for web server software and the underlying operating system. It also recommends configuring the software as securely as possible, for example by disabling unnecessary software services and applications, which may themselves have security holes that provide openings for attacks. Another key recommendation, especially for large-scale operations, is to plan the human resources needed to deploy and operate a secure web server: a full complement of IT experts, such as system and network administrators, working together to establish and maintain security.

The guide advocates “defense in depth”—installing safeguards at various points of entry into the server, from the router that handles all incoming data traffic to the specific machines that house the server software. In addition, the guide recommends, organizations should monitor log files, create procedures for recovering from attacks, and regularly test the security of their systems.

The guide is designed for federal departments and agencies, but may be applicable to any web server to which the outside world has access. The guide is available free of charge on the NIST website.

Media Contact: Ben Stein, 301-975-3097


Home Sprinklers Score ‘A’ in NIST Cost-Benefit Study

Sometimes life-saving technologies seem beyond the reach of the average person. If you put residential fire sprinklers in that category, think again. National Institute of Standards and Technology (NIST) economists ran the numbers. Their benefit-cost analysis found that for new home construction, a multipurpose network sprinkler system that connects to a house’s regular water supply and piping makes good economic sense.

NIST’s Benefit-Cost Analysis of Residential Fire Sprinkler Systems report, released last month, examines data from 2002 to 2005 to evaluate the economic performance of a residential “wet-pipe” fire sprinkler system. The additional economic benefits of installing a multipurpose network sprinkler system (the least costly wet-pipe system available) are estimated for three types of newly constructed single-family houses that are also equipped with smoke detectors. The study builds on a prior cost analysis developed by NIST’s Building and Fire Research Laboratory and offers a current analysis of the economics of residential fire sprinkler technology.

According to NIST, the cost in 2005 dollars of adding a multipurpose network sprinkler system to a house under construction was approximately $2,075 for a 3,338-square-foot colonial-style house, $1,895 for a 2,257-square-foot townhouse and $829 for a 1,171-square-foot ranch house. However, when a house fire occurs, the estimated benefits of a residential fire sprinkler system include a 100 percent reduction in civilian fatalities, a 57 percent reduction in civilian injuries, and a 32 percent reduction in both direct property damage (property losses that would not be covered by insurance) and indirect property costs (fire-related expenses such as temporary shelter, missed work, extra food costs, legal expenses, transportation, emotional counseling and childcare). Houses with sprinklers in addition to smoke alarms also qualify for an 8 percent reduction in homeowner insurance premiums compared with houses equipped only with smoke alarms.

After subtracting installation costs and weighting the benefits by the odds that a house would catch on fire, NIST economists concluded that, depending on assumptions, the net gain from installing a sprinkler system (in 2005 dollars) would vary between $704 and $4,801 for the colonial-style house, between $884 and $4,981 for the townhouse, and between $1,950 and $6,048 for the ranch-style house, over the 30-year study period. In all cases examined, the researchers found that the data supported the finding that multipurpose network residential fire sprinkler systems are cost-effective.
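
The relationship between the costs and net gains quoted above is simple arithmetic, sketched below. This is an illustrative calculation, not code from the NIST report; the `houses` table and the `implied_benefits` helper are constructs for the example, using the article's 2005-dollar figures.

```python
# Back-of-the-envelope arithmetic: since net gain = expected benefits minus
# installation cost, adding the cost back onto each end of the quoted
# net-gain range yields the implied range of expected 30-year benefits.

houses = {
    "colonial":  {"cost": 2075, "net_gain": (704, 4801)},
    "townhouse": {"cost": 1895, "net_gain": (884, 4981)},
    "ranch":     {"cost": 829,  "net_gain": (1950, 6048)},
}

def implied_benefits(cost, net_gain_range):
    """Add the installation cost back onto both ends of the net-gain range."""
    low, high = net_gain_range
    return (low + cost, high + cost)

for name, house in houses.items():
    low, high = implied_benefits(house["cost"], house["net_gain"])
    print(f"{name}: implied expected benefits ${low:,}-${high:,}")
```

Note that the ranch house, despite the smallest net-gain spread in dollars, has by far the lowest installation cost, which is why it remains cost-effective in every case examined.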

The United States Fire Administration (USFA), part of the Department of Homeland Security (DHS), funded the research.

Benefit-Cost Analysis of Residential Fire Sprinkler Systems (NISTIR 7451) by David T. Butry, M. Hayden Brown and Sieglinde K. Fuller can be downloaded from the NIST website.

Media Contact: John Blair, 301-975-4261


Biopharmaceutical Infrastructure Key to Lower Drug Development Costs

Improvements to the technology infrastructure for researching and developing new biopharmaceuticals would be expected to save the industry hundreds of millions of dollars annually, according to a new economic study* sponsored by the National Institute of Standards and Technology (NIST).

biopharm economics chart
A new NIST-sponsored study found that the biopharmaceutical industry spends a total of $1,219 million annually on infrastructure technology: $884 million on R&D technology infrastructure (bioimaging, biomarkers, informatics and gene expression) and $335 million on infrastructure for commercial manufacturing and postmarket surveillance.
Credit: NIST

In the study, prepared by RTI International for NIST, the authors found that over the last two decades, emphasis in new drug development has shifted from small-molecule chemicals to large-molecule proteins and other biopharmaceuticals such as human insulin, gene therapies and specialized antibiotic treatments. The report notes that the biopharmaceutical industry currently spends about $21 billion annually on research and development and has commercialized over 400 products.

Producing and maintaining the infrastructure that supports R&D, manufacturing and postmarket surveillance, including core data, methods, and standards used to determine the quality and efficacy of biopharmaceuticals, costs the industry a total of $1.2 billion annually, according to the report. The study focused on expenditures for four major categories of technical infrastructure: bioimaging, biomarkers, bioinformatics, and gene expression, as well as expenditures for infrastructure supporting processing and quality control for commercial manufacturing and activities involved with postmarket surveillance. (See chart.)

According to the study, improvements to this infrastructure, such as better standardization of data collection and analysis, would be expected to save between 25 and 48 percent of R&D expenses for each new biopharmaceutical drug approved by the Food and Drug Administration. Better technical infrastructure is also projected to reduce the average development time per approved drug from 122 months to 98 months, a reduction of 20 percent. The study further estimated that total industry manufacturing costs could be reduced over the four major phases of manufacturing by $1.5 billion or 23 percent.
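
The headline figures above are internally consistent, as a quick sketch shows. This is illustrative arithmetic only, not part of the study itself.

```python
# Quick consistency checks on the figures quoted in this article.

# Annual infrastructure spending: $884 million (R&D) plus $335 million
# (manufacturing and postmarket surveillance) totals the stated $1,219 million.
assert 884 + 335 == 1219

# Development time: a drop from 122 months to 98 months per approved drug
# is about a 20 percent reduction, as the study states.
before_months, after_months = 122, 98
reduction = (before_months - after_months) / before_months
print(f"{reduction:.0%}")  # prints "20%"
```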

Data for the study were gathered from individual researchers and organizations including a survey of 44 technical experts whose companies represent 42 percent of the combined annual R&D spending and 49 percent of the combined annual R&D sales in biopharmaceuticals.

The ultimate beneficiaries of an improved biopharmaceutical infrastructure, wrote the study’s authors, “are patients who gain access to a broader array of novel therapies where development is supported by an effective technology infrastructure.”

The full report, Economic Analysis of the Technology Infrastructure Needs of the U.S. Biopharmaceutical Industry, is available on the NIST website.

* RTI International, Economic Analysis of the Technology Infrastructure Needs of the U.S. Biopharmaceutical Industry: Planning Report 07-01, August 2007, 201 pp.

Media Contact: Gail Porter, 301-975-3392



NIST Official Testifies on Fire Research Program

On Oct. 2, 2007, S. Shyam Sunder, director of the Building and Fire Research Laboratory at the National Institute of Standards and Technology (NIST), testified before the Subcommittee on Technology and Innovation of the House Science and Technology Committee on NIST’s fire research program.

Fires, Sunder said, remain a serious national problem. Each year, more than 3,000 people die in fires, and per capita fire deaths are 70 percent higher in the United States than in the European Union. NIST’s long-standing fire research program is focused on reducing losses and risk by advancing innovative fire protection technologies, increasing the safety of buildings threatened by fire, and better understanding how building fires develop. New building materials, construction practices and building designs all can potentially affect fire safety and the time available for safely exiting buildings. In addition to improving fire safety in buildings, NIST research provides the science and performance measures that are critical for developing and implementing the new technologies necessary to improve the effectiveness and safety of emergency responders. Sunder’s full testimony is available on the NIST website.

For some recent stories on NIST fire research, see “NIST Test Fans the Flames for High-Rise Fire Safety”, “NIST Begins Technical Study of S.C. Warehouse Fire”, “Fire Tests Examine Structural Collapse Hazards and Warning Devices” and “NIST Firebrand Device Could Save U.S. and Japanese Homes.”

Media Contact: Michael Baum, 301-975-2763


Celebrating the Physics Nobel Prize Winners

This week scientists at the National Institute of Standards and Technology (NIST) have joined colleagues around the world in congratulating Albert Fert of France and Peter Grünberg of Germany for winning this year’s Nobel Prize in Physics. The pair won for their independent discovery of giant magnetoresistance (GMR), the phenomenon used for reading data on today’s high-capacity magnetic disk drives.

NIST has numerous connections to the Nobel Prize-winning work and to the study of GMR in general. The Nobel Prize background material cites work done at the NIST Center for Neutron Research (NCNR) by research groups that included Charles Majkrzak, Julie Borchers and Ross Erwin, all of whom are now part of the NIST staff. Then as now, the neutron work takes advantage of the ability of neutrons to determine the detailed microscopic properties of magnetic materials. These scientists and other NIST researchers (including Boulder’s Pavel Kabos) collaborated directly with Grünberg on more than seven joint publications.

In addition, many other NIST researchers have advanced the GMR field; accounts of their work are available on the NIST website.

Media Contact: Michael Baum, 301-975-2763
