Tech Beat - June 25, 2013

Editor: Michael Baum
Date created: June 25, 2013
Date Modified: June 25, 2013 
Contact: inquiries@nist.gov

NIST Announces Plan to Create Center of Excellence for Advanced Materials Research

Soft materials such as complex polymers pose particular problems for theory and modeling. This molecular dynamics simulation done at NIST shows interactions in a polymer mixture as it solidifies. The work has applications in fields as diverse as protein preservation, batteries and polymer nanocomposites.
Credit: Douglas/NIST

The National Institute of Standards and Technology (NIST) today announced that it plans to establish a new Advanced Materials Center of Excellence to facilitate collaborations between NIST and researchers from academia and industry on advanced materials development.

The planned center, which NIST expects to fund at approximately $25 million over five years, will emphasize innovations in measurement technology, modeling, simulation, and data and informatics tools related to advanced materials. NIST plans to hold a competition this summer to select an organization to host the new center.

The center will play a major role in NIST's support for the President's Materials Genome Initiative (MGI), which was launched two years ago.* The MGI is a multiagency effort in materials research with a goal of cutting development times in half while reducing the cost of discovering, engineering and deploying new advanced materials.

Typically, it takes 10 to 20 years for a new material to go from initial research to first use. Getting advanced materials and products to market first, from the tough new glasses used for smartphone screens to new biomaterials for repairing damaged tissues and organs in the body, is critical if the United States is to fuel innovation, create jobs and spur economic growth.

The President's Office of Science and Technology Policy (OSTP) today announced a series of new initiatives in support of the MGI, including the planned NIST center of excellence. See: "Two Years Later, Bold New Steps for the Materials Genome Initiative" at www.whitehouse.gov/blog/2013/06/24/two-years-later-bold-new-steps-materials-genome-initiative.

* For more on the Materials Genome Initiative, see www.whitehouse.gov/mgi.

 

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


Time to Celebrate: NIST Radio Station WWVB Logs 50 Years of Innovation

Fifty years ago on July 5, 1963, a modest radio station in Fort Collins, Colo., officially went on the airwaves—a landmark event for U.S. industry and the American public.

Then: Engineer David Andrews and technician Robert Oase are shown by the WWVB transmitter in 1963. Oase is relaying instructions to an engineer at another location who is tuning the antenna.
Credit: NIST
Now: Engineer Matthew Deutch and technicians Douglas Sutton, Glenn Nelson and Bill Yates (left to right) are shown with one of three WWVB transmitters as it broadcasts the station's signal in 2013.
Credit: NIST

Operated by the National Institute of Standards and Technology (NIST), WWVB is best known today for broadcasting the time to an estimated 50 million radio-controlled clocks, wall clocks, wristwatches and other timekeeping devices across the U.S. mainland. But back in 1963, the station had an entirely different audience, broadcasting standard frequencies at the high accuracy needed by satellite and missile programs. The time signal, added two years later, became a popular means of synchronizing power plants to prevent brownouts and of coordinating analog telephone networks.

"WWVB was one of the original time providers that made an impact on industry," says John Lowe, leader of NIST's Time and Frequency Services Group. "The signal's low frequency helped with stability and propagated very well at night."

WWVB's time signals are accurate because they are synchronized to NIST's atomic clock ensemble just 80 kilometers (about 50 miles) down the road in Boulder, Colo. This ensemble is calibrated by the NIST-F1 cesium fountain atomic clock, the U.S. civilian time standard.

WWVB was not NIST's first foray into radio. This year marks the 90th anniversary of NIST's high-frequency radio station WWV, which shares the Fort Collins site. WWV began experimental broadcasts in Washington, D.C., in 1920 and continuous operations in 1923, and moved to Fort Collins in 1966.*

This 1963 photo of NIST's then-new radio station in Fort Collins, Colo., was taken from one of four towers supporting an antenna. The small building in the foreground is the helix house, which contained equipment to electrically match the station's transmitter to the antenna.
Credit: NIST
Modern photo of NIST radio station WWVB. The transmitters are in the building. In the background are WWVB's four north towers supporting the antenna in the center. The cables are most visible under dark, stormy skies like this.
Credit: NIST

But WWVB is unique. Its legacy of innovation began with its unusually low operating frequency of 60 kilohertz (kHz, or cycles per second), which corresponds to a wavelength of 5 kilometers, more than 3 miles. This concept dates back to 1956, when the then-new NIST Boulder campus began operating 60 kHz radio station KK2XEI a few hours a day. The station radiated less than 2 watts of power but proved that low frequencies are extremely stable, with the ground below and the ionosphere above forming a huge duct that guides signals around the curvature of the Earth. In 1963, WWVB's received signal was 100 times more stable than WWV's. (NIST experimented with even lower frequencies with standard-frequency radio station WWVL from 1960 to 1972.)
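
As a quick check on those figures, the wavelength follows directly from the broadcast frequency. This is the standard free-space relation, not a calculation from the original article:

    \lambda = \frac{c}{f} = \frac{3.00 \times 10^{8}\ \mathrm{m/s}}{60 \times 10^{3}\ \mathrm{Hz}} = 5.0 \times 10^{3}\ \mathrm{m} \approx 3.1\ \mathrm{miles}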

WWVB got its call sign in 1960; the B probably stands for Boulder. The station was built in Fort Collins because it is near Boulder but farther from the mountains, making it easier to broadcast an omnidirectional signal. The site is also attractive for its high ground conductivity, due to highly alkaline soil.

WWVB began official broadcasts at 4,000 watts (4 kW). The power level was later increased to 13 kW, where it remained for many years. Then in the 1980s, precision timing signals from satellites began providing an alternative for many of industry's timekeeping needs. In 1999, WWVB's signal was boosted to 50 kW so it could reach the entire U.S. mainland and be picked up by small antennas. This led to mass production and wide use of radio-controlled timepieces, giving consumers broad access to atomically precise NIST time.**

WWVB now operates at a power level of 70 kW, and its latest innovation is a modified time code format,*** which Lowe says boosts reception capability 100-fold and will encourage the inclusion of radio-controlled timing in a broader range of products. For example, appliances such as refrigerators, microwave ovens and thermostats, as well as motor vehicles, traffic light timers and lawn sprinkler systems, could now receive atomic time over the airwaves.
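
For readers curious how a receiver pulls bits out of the legacy amplitude-modulated WWVB signal, here is a minimal Python sketch. It assumes the well-documented pulse-width scheme, in which carrier power is reduced for about 0.2 s to send a 0, 0.5 s to send a 1, and 0.8 s to send a frame marker, one symbol per second; the function name and tolerance are illustrative, not NIST code.

    def classify_wwvb_symbol(reduction_seconds, tolerance=0.1):
        """Classify one second of the legacy WWVB amplitude time code.

        The station drops its carrier power at the start of each second;
        the duration of that reduction encodes the symbol:
        ~0.2 s -> bit 0, ~0.5 s -> bit 1, ~0.8 s -> frame marker.
        The tolerance (illustrative) absorbs receiver timing jitter.
        """
        for nominal, symbol in ((0.2, "0"), (0.5, "1"), (0.8, "marker")):
            if abs(reduction_seconds - nominal) <= tolerance:
                return symbol
        return "invalid"

    # Example: a measured 0.52 s power reduction decodes as a 1 bit.
    print(classify_wwvb_symbol(0.52))  # -> "1"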

"This will get us beyond the novelty of clocks and watches and get the time signal to original equipment manufacturers, and get it into cars, refrigerators and sprinkler systems," Lowe says. "My hope is that WWVB will become the source for time in a wide range of products and have a new role to play in our society."

"We look forward to another 50 years."

*WWV provides high-frequency time and frequency signals, geophysical alerts and marine storm warnings to users, including amateur radio operators, stopwatch and timer calibration laboratories, piano tuners, and telephone callers wishing to manually set watches and clocks. See www.nist.gov/pml/div688/grp40/wwv.cfm.
**For information about how radio-controlled clocks work and what to do when they don't work, see www.nist.gov/pml/div688/grp40/radioclocks.cfm.
***See 2013 NIST Tech Beat article, "New NIST Time Code to Boost Reception for Radio-Controlled Clocks," at www.nist.gov/pml/div688/wwvb-030513.cfm.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


Microscopy Technique Could Help Computer Industry Develop 3-D Components

A technique developed several years ago at the National Institute of Standards and Technology (NIST) for improving optical microscopes now has been applied to monitoring the next generation of computer chip circuit components, potentially providing the semiconductor industry with a crucial tool for improving chips for the next decade or more.

These three-dimensional tri-gate (FinFET) transistors are among the 3-D microchip structures that could be measured using through-focus scanning optical microscopy (TSOM).
Courtesy Intel Corporation

The technique, called Through-Focus Scanning Optical Microscopy (TSOM), has now been shown able to detect tiny differences in the three-dimensional shapes of circuit components, which until very recently have been essentially two-dimensional objects. TSOM is sensitive to features that are as small as 10 nanometers (nm) across, perhaps smaller—addressing some important industry measurement challenges for the near future for manufacturing process control and helping maintain the viability of optical microscopy in electronics manufacturing.

For decades, computer chips have resembled city maps in which components are essentially flat. But as designers strive to pack more components onto chips, they have reached the same conclusion as city planners: The only direction left to build is upwards. New generations of chips feature 3-D structures that stack components atop one another, but ensuring these components are all made to the right shapes and sizes requires a whole new dimension—literally—of measurement capability.

"Previously, all we needed to do was show we could accurately measure the width of a line a certain number of nanometers across," explains NIST's Ravikiran Attota. "Now, we will need to measure all sides of a three-dimensional structure that has more nooks and crannies than many modern buildings. And the nature of light makes that difficult."

Part of the trouble is that components are now so small that a light beam can't quite get at them. Optical microscopes are normally limited to features larger than about half the wavelength of the light used (about 250 nanometers for green light). Microscopists have worked around this by lining up many identical components at regular intervals, observing how light scatters off the group, and fitting the data with optical models to determine the dimensions. But these optical measurements, as currently used in manufacturing, have great difficulty with the newer 3-D structures.

Other, non-optical imaging methods such as scanning probe microscopy are expensive and slow, so the NIST team decided to test the abilities of TSOM (http://www.nist.gov/pml/div683/nanoscale_072110.cfm), a technique that Attota played a major role in developing. The method uses a conventional optical microscope, but rather than taking a single image, it collects a series of 2-D images at different focal positions, forming a 3-D data space. A computer then extracts brightness profiles from these multiple out-of-focus images and uses the differences between them to construct the TSOM image. The TSOM images are somewhat abstract, but the differences between them are clear enough to reveal minute shape differences in the measured structures, bypassing optical models and the complexities they introduce.
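
In outline, the data handling behind TSOM is compact enough to sketch. The Python fragment below is a simplified illustration, not NIST's implementation: it treats the through-focus image stack as a 3-D array and forms a TSOM-style image from differences against a reference stack (here the stack's own mean, standing in for the stack of a known reference structure).

    import numpy as np

    def tsom_image(focal_stack, reference_stack=None):
        """Build a simplified TSOM-style difference image.

        focal_stack: 3-D array (n_focus, height, width) of 2-D images
        collected at successive focal positions. If a reference stack
        (e.g., from a known-good structure) is supplied, the image is
        the focus-summed difference from it; otherwise deviations from
        the stack's own mean image serve as a stand-in.
        """
        stack = np.asarray(focal_stack, dtype=float)
        if reference_stack is None:
            reference = stack.mean(axis=0, keepdims=True)
        else:
            reference = np.asarray(reference_stack, dtype=float)
        # Differences among out-of-focus images carry the shape signal.
        return np.abs(stack - reference).sum(axis=0)

    # Example with synthetic data: 11 focus steps over a 64 x 64 field.
    rng = np.random.default_rng(0)
    stack = rng.normal(1.0, 0.01, size=(11, 64, 64))
    print(tsom_image(stack).shape)  # (64, 64)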

"Our simulation studies show that TSOM might measure features as small as 10 nm or smaller, which would be enough for the semiconductor industry for another decade," Attota says. "And we can look at anything with TSOM, not just circuits. It could become useful to any field where 3-D shape analysis of tiny objects is needed."

*R. Attota, B. Bunday and V. Vartanian. Critical dimension metrology by through-focus scanning optical microscopy beyond the 22 nm node. Applied Physics Letters, DOI: 10.1063/1.4809512, published online June 6, 2013.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


New NIST Test for Firefighter Breathing Equipment Goes into Effect Sept. 1

As of Sept. 1, 2013, standard firefighter breathing equipment cannot be certified to National Fire Protection Association (NFPA) standards unless the facepiece lenses pass a new rigorous test developed by the National Institute of Standards and Technology (NIST).

Typical firefighter breathing apparatus damaged in NIST tests shows facepiece warping under high heat.
Credit: NIST

The NIST-developed test is designed to reduce the degradation and possible failure of the facepiece lens in self-contained breathing apparatus (SCBA) under high-heat firefighting conditions. NFPA incorporated the NIST test into the 2013 update of its standard for SCBA units.*

The 2013 edition of NFPA 1981, published in January 2013, contains a new "Elevated Temperature Heat and Flame Resistance Test" that exposes the SCBA to 500 °F (260 °C) for five minutes in an oven. This test is followed by 10 seconds of direct flame contact.

In addition, the new version contains a new "Lens Radiant Heat Test" that subjects the SCBA facepieces to a radiant heat flux of 15 kilowatts per square meter (kW/m2) for five minutes. As part of this test, the facepiece is required to maintain an air supply (positive pressure) inside the mask for a total of 24 minutes.

In controlled experiments,** NIST researchers determined that an incident radiant heat flux of 15 kW/m2 would be a representative test criterion for evaluating the performance of SCBA facepiece lenses. It is representative of the flux experienced by firefighters approaching the onset of the deadly phenomenon known as flashover, a state of total surface involvement in a fire of combustible material within an enclosure. The researchers also found that measuring internal facepiece pressure indicated when holes formed in the lens and how those holes affected firefighter air-supply duration and breathing protection.
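
For a sense of scale, the total radiant energy delivered per unit area during the lens test is a simple tally (a back-of-the-envelope figure, not a quantity specified in the standard):

    E = q'' \times t = 15\ \mathrm{kW/m^2} \times 300\ \mathrm{s} = 4.5\ \mathrm{MJ/m^2}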

The new test and test conditions are important advances in improving the performance of what has been, perhaps, the most vulnerable component of a firefighter's protective gear in high-heat conditions. Failure of a lens can expose a firefighter to toxic gases and can result in burns to the respiratory tract as well as asphyxiation. Documented problems include holes and extensive crazing as well as bubbling and deforming of lenses.

In several SCBA-related deaths, degraded masks were found affixed to the faces of victims who suffered thermal burns to their airways.

In the United States, SCBA makers submit their products for certification testing before they are sold. Until August 31, 2013, compliance with NFPA standards requires only passing a less severe "heat and flame test," specified in the 2007 version of NFPA 1981 and retained, along with the new NIST test, in the recent update.

NIST experiments conducted during development of the new facepiece-lens test were supported, in part, by the Department of Homeland Security Science and Technology Directorate and the United States Fire Administration. The National Institute for Occupational Safety and Health (NIOSH) Fire Fighter Fatality and Injury Prevention Program played a critical role in identifying the lens degradation issue.

*NFPA 1981: Standard on Open-Circuit Self-Contained Breathing Apparatus (SCBA) for Emergency Services (2013 edition). See: www.nfpa.org/aboutthecodes/AboutTheCodes.asp?DocNum=1981

**See the 2012 NIST Tech Beat story "NIST Study of Hazard to Firefighters Leads to Safety Alert" at www.nist.gov/public_affairs/tech-beat/tb20120725.cfm#nfpa. The technical report, Thermal Performance of Self-Contained Breathing Apparatus Facepiece Lenses Exposed to Radiant Heat Flux (NIST Technical Note 1785), Feb. 2013, by A. Putorti, Jr.; A. Mensch; N. Bryner and G. Braga, is available at: www.nist.gov/manuscript-publication-search.cfm?pub_id=912504.

Media Contact: Mark Bello, mark.bello@nist.gov, 301-975-3776


New Quantum Dot Technique Combines Best of Optical and Electron Microscopy

It's not reruns of "The Jetsons", but researchers working at the National Institute of Standards and Technology (NIST) have developed a new microscopy technique that uses a process similar to how an old tube television produces a picture—cathodoluminescence—to image nanoscale features. Combining the best features of optical and scanning electron microscopy, the fast, versatile, and high-resolution technique allows scientists to view surface and subsurface features potentially as small as 10 nanometers in size.

Much like in an old tube television where a beam of electrons moves over a phosphor screen to create images, the new microscopy technique works by scanning a beam of electrons over a sample that has been coated with specially engineered quantum dots. The dots absorb the energy and emit it as visible light that interacts with the sample at close range. The scattered photons are collected using a similarly closely placed photodetector (not depicted), allowing an image to be constructed.
Credit: Dill/NIST

The new microscopy technique, described in the journal AIP Advances,* uses a beam of electrons to excite a specially engineered array of quantum dots, causing them to emit low-energy visible light very close to the surface of the sample, exploiting so-called "near-field" effects of light. By correlating the local effects of this emitted light with the position of the electron beam, spatial images of these effects can be reconstructed with nanometer-scale resolution.

The technique neatly evades two problems in nanoscale microscopy: the diffraction limit, which restricts conventional optical microscopes to resolutions no better than about half the wavelength of the light (about 250 nm for green light), and the relatively high beam energies and sample preparation requirements of electron microscopy, which are destructive to fragile specimens such as tissue.

NIST researcher Nikolai Zhitenev, a co-developer of the technique, had the idea a few years ago to use a phosphor coating to produce light for near-field optical imaging, but at the time no phosphor was available that was thin enough; thick phosphors cause the light to diverge, severely limiting the image resolution. This changed when the NIST researchers teamed with researchers from a company that builds highly engineered quantum dots optimized for lighting applications. The quantum dots could potentially do the same job as a phosphor and be applied in a coating homogeneous and thick enough to absorb the entire electron beam, yet thin enough that the light produced does not have to travel far to reach the sample.

The collaborative effort found that the quantum dots, which have a unique core-shell design, efficiently produced low-energy photons in the visible spectrum when energized with a beam of electrons. With a potential thin-film light source in hand, the group developed a deposition process to bind the dots to specimens as a film with a controlled thickness of approximately 50 nm.

Much like in an old tube television where a beam of electrons moves over a phosphor screen to create images, the new technique works by scanning a beam of electrons over a sample that has been coated with the quantum dots. The dots absorb the electrons' energy and emit it as visible light that interacts with and penetrates the sample surface beneath the coating. After interacting with the sample, the scattered photons are collected by a closely placed photodetector, allowing an image to be constructed. The first demonstration of the technique imaged the natural nanostructure of the photodetector itself. Because both the light source and the detector are so close to the sample, the diffraction limit doesn't apply, and much smaller objects can be imaged.
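
The image formation itself is simple bookkeeping: record one photodetector reading per electron-beam position and arrange the readings on the scan grid. The Python sketch below illustrates only that step; the function and the synthetic signal are stand-ins, not the instrument's software.

    import numpy as np

    def build_scan_image(scan_rows, scan_cols, read_signal):
        """Reconstruct an image from a raster scan.

        The electron beam visits each (row, col) position; the quantum
        dots there emit near-field light, and read_signal(row, col)
        returns the photodetector response at that position. Mapping
        the responses back onto the scan grid yields the image.
        """
        image = np.zeros((scan_rows, scan_cols))
        for r in range(scan_rows):
            for c in range(scan_cols):
                image[r, c] = read_signal(r, c)
        return image

    # Example with a synthetic detector response.
    demo = build_scan_image(32, 32, lambda r, c: np.sin(r / 4) * np.cos(c / 4))
    print(demo.shape)  # (32, 32)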

"Initially, our research was driven by our desire to study how inhomogeneities in the structure of polycrystalline photovoltaics could affect the conversion of sunlight to electricity and how these devices can be improved," says Heayoung Yoon, the lead author of the paper. "But we quickly realized that this technique could also be adapted to other research regimes, most notably imaging for biological and cellular samples, wet samples, samples with rough surfaces, as well as organic photovoltaics. We are anxious to make this technique available to the wider research community and see the results."

This work was a collaboration among researchers from NIST; the Maryland NanoCenter at the University of Maryland, College Park; Worcester Polytechnic Institute; QD Vision; and Sandia National Laboratories.

* H. Yoon, Y. Lee, C. Bohn, S. Ko, A. Gianfrancesco, J. Steckel, S. Coe-Sullivan, A. Talin and N. Zhitenev. High-resolution photocurrent microscopy using near-field cathodoluminescence of quantum dots. AIP Advances. Published online June 10, 2013.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


High-Efficiency Photon Source Brings Day of Reckoning Closer for a Famous Quantum Test

Physicists working at the National Institute of Standards and Technology (NIST) and the Joint Quantum Institute (JQI) are edging ever closer to getting really random.

NIST physicist Alan Migdall spotlights the crystal that converts one photon to two in the middle of the complex apparatus used as a high-efficiency source of paired photons for quantum experiments.
Credit: Baum/NIST

Their work—a source that provides the most efficient delivery of a particularly useful sort of paired photons yet reported*—sounds prosaic enough, but it represents a new high-water mark in a long-term effort toward two very different and important goals, a definitive test of a key feature of quantum theory and improved security for Internet transactions.

The quantum experiment at the end of the rainbow is an iron-clad test of Bell's inequality. The Northern Irish physicist John Stewart Bell first proposed it in 1964 to resolve conflicting interpretations of one of the stranger parts of quantum theory. Theory seems to say that two "entangled" objects such as photons must respond to certain measurements, such as of their direction of polarization, in a way that implies each knows instantaneously what happens to the other, even when they are so far apart that the information would have to travel faster than the speed of light.

One possible explanation is the so-called "hidden variable" idea: maybe this behavior is somehow wired into the photons at the moment they're created, and the results of our experiments are coordinated behind the scenes by some unknown extra process. Bell, in a clever bit of mathematical logic, pointed out that if this were so, evidence of the hidden variable would show up statistically in measurements of such entangled pairs. That statistical test is Bell's inequality.

"Simply put," explains NIST physicist Alan Migdall, "we determine the values for each outcome according to this relation and add them up. It says we can't get—in this version—anything greater than two. Except, we do."

The experiment has been run over and over again for years, and the violation of Bell's inequality should put hidden variables to rest, but for the fact that no real-world experiment is perfect. That's the source of the Bell experiment "loopholes."

"The theorists are very clever and they say, okay, you got a number greater than two but your detectors weren't a hundred percent efficient, so that means you're not measuring all the photons," says Migdall, "Maybe there's some system that has some way of controlling which ones you're going to get, and the subset you detect gives that result. But if you had measured all of the photons you would not have gotten a number greater than two."

That's the experimentalists' challenge: to build systems so tight, so loss-free, that the loopholes are mathematically impossible. Theorists calculate that closing the loophole requires an experimental system with at least 83 percent efficiency—no more than 17 percent photon loss.
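
That threshold is the standard detection-loophole bound for a symmetric test of the CHSH inequality with maximally entangled states (a textbook result, not derived in the article): the overall efficiency \eta from source to detection must satisfy

    \eta > \frac{2}{1 + \sqrt{2}} \approx 0.828

that is, about 83 percent, leaving room for no more than roughly 17 percent loss anywhere in the chain.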

"It's really hard to do this," says Migdall, "because it's not just the detector, it's everything from where the photons are created to where they're detected. Any loss in the optics is part of that." Fortunately, the team has a 99 percent efficient detector that was developed by NIST physicist Sae Woo Nam in 2010.**

Their latest work addresses that optical part of the chain. The team has (carefully) built a source of paired, single-mode photons with a symmetric efficiency of 84 percent. Single-mode means the photons are restricted to the simplest spatial beam shape required for a variety of quantum information applications. Symmetric means your chances of detecting either photon of the pair are equally good. The photon pairs here are not the entangled pairs of a Bell test, but rather a "heralding" system where the detection of one photon tells you that the other exists, and so can be measured. You'd need two of these systems, merged together without efficiency loss, to run a loophole-free Bell test. But it's a big step on the way.
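
Operationally, heralding efficiency is the fraction of herald detections accompanied by a detected partner photon. Below is a minimal sketch assuming idealized counting; real measurements correct for detector efficiency, dark counts and accidental coincidences, and the numbers are illustrative, chosen to match the reported 84 percent.

    def heralding_efficiency(coincidences, herald_singles):
        """Fraction of herald detections whose partner photon arrived.

        coincidences: counts where both detectors fired together
        herald_singles: total counts on the heralding detector
        A symmetric source gives nearly the same value with either
        arm used as the herald.
        """
        return coincidences / herald_singles

    # Illustrative counts only: 84,000 coincidences against 100,000
    # herald counts corresponds to the 84 percent symmetric efficiency.
    print(heralding_efficiency(84_000, 100_000))  # -> 0.84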

It's also a big step toward solving an entirely different problem: providing a trusted source of truly random numbers, says Migdall. Random numbers are essential to encryption and identity verification systems on the Internet, but true randomness is hard to come by. Quantum processes of the same sort tested by Bell's inequality would be one source of absolutely guaranteed, iron-clad randomness, provided there is no hidden variable that makes them secretly predictable. Equally important, the absence of hidden variables means that a random number did not exist before it was measured, which makes it very hard for a spy or a cheater to steal.

"Nobody in the physics world believes we'll get a surprising result, but in the security world you have to believe that everything out there is malicious, that Mother Nature is malicious," says Migdall, "We have to prove quantum theory to those guys."

For more on that and the NIST random number "beacon," see "Truly Random Numbers — But Not by Chance" at www.nist.gov/pml/div684/random_numbers_bell_test.cfm. The Joint Quantum Institute is a collaboration of NIST, the University of Maryland and the Laboratory for Physical Sciences at UMD.

*M. Da Cunha Pereira, F. E. Becerra, B. L. Glebov, J. Fan, S. W. Nam and A. Migdall. Demonstrating highly symmetric single-mode, single-photon heralding efficiency in spontaneous parametric downconversion. Optics Letters, Vol. 38, Issue 10, pp. 1609-1611 (2013). DOI: 10.1364/OL.38.001609.
**See the 2010 NIST news story, "NIST Detector Counts Photons With 99 Percent Efficiency" at www.nist.gov/pml/div686/detector_041310.cfm.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


Revised Security Guidelines Provide Strategy for Government Mobile Device Management

The National Institute of Standards and Technology (NIST) has published a mobile device management guide for federal agencies seeking secure methods for workers to use smart phones and tablets.

Employees want to be connected to work through mobile devices for flexibility and efficiency, and managers can appreciate that. However, the technology that delivers these advantages also provides challenges to an agency's security team because these devices can be more vulnerable.

For example, a smartphone or tablet could be stolen or lost, potentially allowing unauthorized individuals access to an agency's network and its sensitive information. An employee also can unknowingly infect an agency's network by downloading an application containing malware.

Guidelines for Managing the Security of Mobile Devices in the Enterprise helps federal agencies and other organizations struggling with this dilemma. Originally issued in 2008 as Guidelines on Cell Phone and PDA Security, the new guidelines have been extensively updated and reflect comments received on a draft version issued a year ago.

The revised guidelines recommend using centralized device management at the organization level to secure both agency-issued and individually owned devices used for government business.

Centralized programs manage the configuration and security of mobile devices and provide secure access to an organization's computer networks. Many agencies currently use this type of system to manage the smartphones they issue to staff. The new NIST guidelines offer recommendations for selecting, implementing and using centralized management technologies for securing mobile devices.

Other key recommendations include instituting a mobile device security policy, implementing and testing a prototype of the mobile device solution before putting it into production, securing each organization-issued mobile device before allowing a user to access it, and maintaining mobile device security.
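
To make the flavor of centralized management concrete, here is an illustrative sketch of a baseline policy check. SP 800-124 Revision 1 states its recommendations in prose; the field names and checks below are hypothetical, not drawn from the publication.

    # Illustrative mobile-device policy baseline, loosely modeled on the
    # kinds of controls the guidelines discuss; every field name here is
    # hypothetical, not taken from SP 800-124.
    BASELINE_POLICY = {
        "require_device_passcode": True,
        "require_storage_encryption": True,
        "allow_unvetted_app_stores": False,
        "remote_wipe_enabled": True,
        "vpn_required_for_agency_network": True,
    }

    def device_is_compliant(device_report):
        """Check a device's reported settings against the baseline."""
        return all(device_report.get(key) == value
                   for key, value in BASELINE_POLICY.items())

    print(device_is_compliant(dict(BASELINE_POLICY)))  # True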

Guidelines for Managing the Security of Mobile Devices in the Enterprise (NIST Special Publication 800-124 Revision 1) is available at www.nist.gov/manuscript-publication-search.cfm?pub_id=913427.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


Three Presidential Innovation Fellows Begin Work at NIST

Three of this year's Presidential Innovation Fellows (PIF) began work at the National Institute of Standards and Technology (NIST) on June 18, 2013. They will help the institute address issues related to connecting networks of machines, facilities and people, and help energy consumers make better use of their energy usage data.

NIST's Presidential Innovation Fellows are (L-R) Sokwoo Rhee and Geoff Mulligan, who will work on cyber-physical systems, and John Teeter (far right), who will work on the Green Button Initiative. The three fellows, part of the second round of the PIF program, started their 12-month tours at NIST on June 18, 2013.
Credit: Greb/NIST

The PIF program pairs top innovators from the private sector, nonprofits and academia with top innovators in government to work on challenges whose solutions could provide immediate benefits and cost-savings to American citizens, entrepreneurs and businesses. The three NIST fellows, who were selected from a highly competitive pool of applicants, will serve a 12-month "tour of duty."

Two of NIST's fellows, Geoff Mulligan and Sokwoo Rhee, will work on cyber-physical systems (CPS). These new "smart systems" combine networking, information and communication technologies to optimize system performance in real time. Accelerated development of CPS is expected to yield new manufactured products and services in a range of industry sectors essential to U.S. competitiveness, including manufacturing, transportation, energy, health care and defense. NIST's fellows will lead a multidisciplinary team of experts from the private and public sectors and work toward consensus on a framework consisting of a high-level reference architecture, standards and protocols for CPS.*

Mulligan has been instrumental in Internet development, recently having helped design IPv6, the latest version of the communications protocol that routes traffic across the Internet. He holds over 15 patents in computer security, networking and electronic mail, and was called to testify before Congress on electronic commerce and computer security.

Rhee is an entrepreneur who helped to initiate the cyber-physical systems and "internet of things" industry in the early 2000s. His work and achievements have been recognized with multiple awards, including MIT Technology Review's Top Innovators Under 35.

NIST's third fellow, John Teeter, will work on the Green Button Initiative,** which aims to enable energy customers to download their energy usage data securely, in machine-readable form, directly from utilities. He will be part of a "Green Button for America" team whose other member will work from the Department of Energy. A key goal will be to provide leadership in enhancing Green Button data consistency through testing and certification, based on feedback from consumers, vendors and utilities.

Teeter has a 40-year history in the electrical and technology industries, including most recently as chief scientist for People Power, a software company enabling remote control and management of connected devices from mobile devices. Teeter was the founder and CEO of First Step Research, prior to which he was a founder and VP of Engineering at Gold Hill Computers.

The Office of Science and Technology Policy (OSTP) runs the PIF program, now in its second year, and asked agencies across the government to propose new research areas to expand the PIF program for this year. NIST's two proposals were among those selected, bringing the number of project areas up to nine from last year's five. Fellows are funded by the sponsoring agencies.

More information is available in the White House blog post, "New Round of Innovators Joins US Government to Tackle Big Challenges" at www.whitehouse.gov/blog/2013/06/24/new-round-innovators-joins-us-government-tackle-big-challenges.

* For more on NIST work in cyber-physical systems, see www.nist.gov/el/smartcyber.cfm.
** For more on NIST work on the Green Button Initiative, see www.nist.gov/smartgrid/greenbutton.cfm.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


NIST and NTIA Announce Plans to Establish New Center for Advanced Communications

The National Institute of Standards and Technology (NIST) and the National Telecommunications and Information Administration (NTIA) have announced plans to establish a national Center for Advanced Communications in Boulder, Colo. The new center will implement a key provision of a memorandum President Obama issued on “Expanding America’s Leadership in Wireless Innovation.”

The two agencies recently signed a Memorandum of Understanding (MOU) to collaborate on the establishment of the center. The MOU states that the center will leverage the “critical mass of NIST and NTIA research and engineering capabilities concentrated in Boulder” to form a “unique national asset,” and includes the infrastructure and collaborative environment needed to address a wide range of advanced communications challenges. This joint effort will increase the impact of existing efforts already under way in both agencies.

“Advanced communication technologies drive product development in telecommunications, IT, energy, and many other critical economic sectors. This new center will combine NIST’s and NTIA’s research and technology support for U.S. industry so that it can rapidly evaluate and exploit exciting new opportunities in the field,” said Under Secretary of Commerce for Standards and Technology and NIST Director Patrick Gallagher.

Read more at www.nist.gov/public_affairs/releases/nist-ntia-mou-061413.cfm.

Media Contact: Gail Porter, gail.porter@nist.gov, 301-975-3392


NIST Announces New Scaffold Reference Material for Tissue Engineering Research

The National Institute of Standards and Technology (NIST) has issued a new reference material—a sort of standardized sample—of cellular scaffolds for use in tissue engineering research.

A fluorescence micrograph of a test culture of bone cells after one day shows the cells proliferating on the struts of the new NIST reference scaffold for tissue engineering.
Credit: NIST

Growing custom replacement tissue—cartilage, bone, blood vessels, possibly even whole organs—is one of the hot research fields in modern medicine. If the techniques could be perfected, it might become possible to grow transplant materials for patients based on their own cells, avoiding problems with compatibility, immune response and tissue rejection.

Scaffolds—biologically innocuous materials that give developing cells a three-dimensional structural template on which to grow—are important to this process. Considerable research centers on determining the best scaffold materials and designs to encourage the growth of various sorts of tissue. The new NIST reference material is designed as a common ground for research labs, a well-understood, uniform scaffold that can be used as a baseline or control in experiments that measure factors such as cell adhesion and proliferation.

Each unit of NIST Reference Material 8394, “Tissue Engineering Reference Scaffolds for Cell Culture,” includes 24 “free-form” scaffolds made of PCL* in a standard 96-well plate. Each scaffold has six layers of PCL struts laid down in a crisscross pattern. NIST provides reference values for the diameter, spacing and porosity of the struts, as well as adhesion and proliferation of osteoblasts (bone cells).
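
For a sense of how such reference values relate to one another, porosity can be estimated from the strut geometry alone. The sketch below uses made-up dimensions purely for illustration; the authoritative numbers are the reference values NIST supplies with RM 8394.

    import math

    def layer_porosity(strut_diameter_um, strut_spacing_um):
        """Estimate the porosity of one layer of parallel cylindrical struts.

        Models a layer as cylinders of diameter d laid center-to-center
        a distance s apart; the solid fraction of the layer's bounding
        slab (thickness d) is the cylinder cross-section over the d*s cell.
        """
        d, s = strut_diameter_um, strut_spacing_um
        solid_fraction = (math.pi * (d / 2) ** 2) / (d * s)
        return 1.0 - solid_fraction

    # Hypothetical numbers: 300-micrometer struts on 1,000-micrometer
    # centers give roughly 76 percent porosity per layer.
    print(round(layer_porosity(300, 1000), 2))  # -> 0.76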

For more information on RM 8394, go to https://www-s.nist.gov/srmors/view_detail.cfm?srm=8394.

Standard reference materials are among the most widely distributed and used products from NIST. The agency prepares, analyzes and distributes about 1,300 different materials that are used throughout the world to check the accuracy of instruments, validate test procedures and serve as the basis for quality assurance worldwide. NIST reference materials are considered by NIST to be sufficiently homogeneous and stable with respect to one or more properties to be useful for measurement purposes, though they do not meet the more stringent requirements for a standard reference material.**

*poly(ε-caprolactone).
**For more on the distinctions between “reference materials” and “standard reference materials,” see www.nist.gov/srm/definitions.cfm.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


NIST Office of Weights and Measures Receives Important ‘Continuing Education’ Accreditation

The International Association for Continuing Education and Training (IACET) has awarded the National Institute of Standards and Technology (NIST) Office of Weights and Measures (OWM) an "Authorized Provider" accreditation. IACET Authorized Providers are the only organizations approved to offer IACET Continuing Education Units (CEUs); the accreditation certifies that IACET has evaluated the NIST OWM training program and found it compliant with internationally accepted standards.

Instructors José Torres and Phil Wright perform measurements during a dry run of the new Fundamentals of Metrology course offered by the NIST Office of Weights and Measures.
Credit: Gentry/NIST

Many states require that their weights and measures officials receive training throughout their careers. Using an accredited training organization gives those officials confidence that the training they will receive is of high quality.

"We’re proud of our education programs, which train hundreds of metrologists and state and local weights and measures officials each year in important legal metrology skills so that the U.S. weights and measures system stays on the cutting edge," says Carol Hockert, chief of the Office of Weights and Measures at NIST. "Our new partnership with IACET is a demonstration of our commitment to lifelong learning and high standards for all of our programs."

The accreditation period extends for five years and covers courses offered or created during that time following OWM procedures. With this accreditation, NIST OWM joins nearly 650 organizations around the globe that have had their programs vetted by third-party experts in continuing education to ensure the highest possible standards are met.

The NIST OWM analyzes weights and measures training needs, obtains input from the weights and measures community, designs and delivers training for laboratory metrologists and weights and measures officials, measures the impact and effectiveness of training to ensure ongoing continual improvement, and consults with the weights and measures community to ensure ongoing professional development.

In order to achieve Authorized Provider accreditation, NIST OWM completed a rigorous application process, including a review by an IACET site visitor, and successfully demonstrated adherence to the ANSI/IACET 1-2007 Standard addressing the design, development, administration and evaluation of its training program.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


NIST Seeks Private-Sector Input at Cybersecurity Framework Workshop

The National Institute of Standards and Technology (NIST) has opened registration for its 3rd Cybersecurity Framework Workshop, to be held July 10-12, 2013, in San Diego, Calif.

The 3rd Cybersecurity Framework Workshop will bring together representatives from critical infrastructure industries to develop a framework of standards, guidance and practices companies can use to improve cybersecurity. The meeting will be held at the University of California, San Diego; shown here is UCSD's Geisel Library.
Credit: Erik Jepsen/UC San Diego Publications

Executive Order 13636,* "Improving Critical Infrastructure Cybersecurity," gave NIST the responsibility to work with industry to develop a voluntary "framework"—incorporating existing standards, guidelines and best practices—that institutions could use to reduce the risk of cyber attacks. Critical infrastructure includes industries vital to the nation's economy, security and health, such as finance, energy, transportation, food and agriculture, and health care.

More than 700 people attended NIST's first two workshops, in Washington, D.C., and Pittsburgh, with more than 2,500 people participating online. The workshops aim to bring together a broad set of participants from critical infrastructure owners and operators, industry associations, standards development organizations, individual companies and government agencies. The goal is to maximize private-sector input in developing the framework. Participants will be expected to actively assist in the framework development process through hands-on participation in breakout sessions.

"We're holding these workshops in different parts of the country, but our focus is on the nation's critical infrastructure," said project leader Adam Sedgewick. "We have received considerable input already, but we look forward to hearing from both new industry representatives and those who are already engaged with the framework development. We will provide a draft outline and describe the approach of the framework, so it is a crucial time for all relevant industries to be involved to help us fill in the gaps and produce a framework that will be effective and widely used on a voluntary basis."

NIST expects the third workshop to produce a more detailed draft of the Cybersecurity Framework and a corresponding list of current standards, guidelines and practices, as well as important gaps. A final workshop is being planned for September 2013, after which NIST will release the official preliminary framework for public comment. Under the executive order, the final framework must be completed by February 2014.

To register to attend the workshop, visit www.nist.gov/itl/csd/3rd-cybersecurity-framework-workshop-july-10-12-2013-san-diego-ca.cfm. The event is being hosted by the University of California, San Diego, and the National Health Information Sharing and Analysis Center.

Attendees should review the outline of the draft framework in advance. It is expected to be available at www.nist.gov/itl/cyberframework.cfm by the end of June; registrants will be notified when the draft is posted. That site includes details on the framework development process such as links to comments received through a Request for Information, transcripts and video from the previous workshops and information on future workshops.

*"Improving Critical Infrastructure Cybersecurity" is available at www.gpo.gov/fdsys/pkg/FR-2013-02-19/pdf/2013-03915.pdf.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343
