
Tech Beat - July 20, 2007

Editor: Michael Baum
Date created: August 13, 2012
Date Modified: August 13, 2012 
Contact: inquiries@nist.gov

Magnetic Tape Analysis "Sees" Tampering in Detail

The National Institute of Standards and Technology (NIST) has developed an improved version of a real-time magnetic microscopy system that converts evidence of tampering on magnetic audio and video tapes—erasing, overdubbing and other alterations—into images with four times the resolution previously available. This system is much faster than conventional manual analysis and offers the additional benefit of reduced risk of contaminating the tapes with magnetic powder. NIST recently delivered these new capabilities to the Federal Bureau of Investigation (FBI) for validation as a forensic tool.

Earlier versions of this system made images with a resolution of about 400 dots per inch (dpi); see www.nist.gov/public_affairs/releases/tape_analysis.htm. The new system uses four times as many magnetic sensors (256) embedded on a NIST-made silicon chip that serves as a read head in a modified cassette tape deck. The NIST read head operates adjacent to a standard read head, enabling investigators to listen to a tape while simultaneously viewing the magnetic patterns on a computer monitor. Each sensor in the customized read head changes electrical resistance in response to the magnetic field patterns detected on the tape. NIST developed the mechanical system for extracting a tape from its housing and transporting it over the read heads, the electronics interface, and the software that converts maps of sensor resistance measurements into digital images.
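As a rough illustration of that last step, the sketch below normalizes a grid of resistance readings into 8-bit grayscale pixel rows. The sensor count, resistance values and function name here are invented for the example; they are not taken from the NIST system.

```python
# Sketch: turning magnetoresistive sensor readings into grayscale image rows.
# All values are hypothetical; the real system samples 256 sensors per scan.

def readings_to_pixels(rows, lo=None, hi=None):
    """Map each resistance reading (ohms) to an 8-bit intensity (0-255).

    rows: one list of sensor resistances per tape position.
    """
    flat = [r for scan in rows for r in scan]
    lo = min(flat) if lo is None else lo
    hi = max(flat) if hi is None else hi
    span = (hi - lo) or 1.0          # avoid divide-by-zero on a flat signal
    return [[round(255 * (r - lo) / span) for r in scan] for scan in rows]

# Two scans from a hypothetical 4-sensor head:
scans = [[100.0, 102.0, 101.0, 104.0],
         [104.0, 100.0, 103.0, 102.0]]
pixels = readings_to_pixels(scans)
```

Stacking such rows side by side as the tape advances yields a picture of the recorded magnetic pattern, which is the kind of image shown below.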

This image, produced by the new NIST forensic tape analysis system, clearly reveals an overdubbing. The new recording is visible from the bottom left of the image to about 188 millimeters on the distance counter, the large smudge at 216 mm was made by the erase head, and the original recording is visible starting at about 220 mm.
Credit: NIST

The upgrade included quadrupling the image resolution to 1600 dpi, the capability to scan both video and audio tapes, complete computer control of tape handling, and the capability to digitize the audio directly from the acquired image. The software displays the audio magnetic track pattern from the tape to identify tiny features, from over-recording marks to high-intensity signals from gunshots. The system is designed to analyze analog tapes but could be converted to work with digital tapes, according to project leader David Pappas.

The new nanoscale magnetic microscope also has been used experimentally for non-destructive evaluation of integrated circuits. By mapping tiny changes in magnetic fields across an integrated circuit, the device can build up an image of current flow and densities much faster and in greater detail than the single-sensor scanners currently used by the chip industry, says Pappas.

The FBI’s Forensic Audio, Video and Image Analysis Unit receives hundreds of audio tapes for analysis annually, representing evidence from crimes such as terrorism, homicide and fraud. The FBI provided partial funding for development of the NIST tape imaging systems.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880

Speed Bumps Less Important Than Potholes for Graphene

For electrical charges racing through an atom-thick sheet of graphene, occasional hills and valleys are no big deal, but the potholes—single-atom defects in the crystal—they’re killers. That’s one of the conclusions reached by researchers from the National Institute of Standards and Technology (NIST) and the Georgia Institute of Technology who created detailed maps of electron interference patterns in graphene to understand how defects in the two-dimensional carbon crystal affect charge flow through the material. The results, appearing in the July 13 issue of Science,* have implications for the design of graphene-based nanoelectronics.

Comparison of an STM topographic image of a section of graphene sheet (top left) with spectroscopy images of electron interference at three different energies shows strong interference patterns generated by atomic-scale defects in the graphene crystal (red arrows) but only modest disturbances caused by larger-scale bumps in the sheet (blue arrows). Analysis of the ripples shows that the electron energy in graphene is inversely proportional to its wavelength, just like light waves. The area imaged is approximately 40 nanometers square.
Credit: NIST/Georgia Tech

A single layer of carbon atoms tightly arranged in a honeycomb pattern, graphene was long thought to be an interesting theoretical concept that was impossible in practice—it would be too unstable, and crumple into some other configuration. The discovery, in 2004, that graphene actually could exist touched off a rush of experimentation to explore its properties. Graphene has been described as a carbon nanotube unrolled, and shares some of the unique properties of nanotubes. In particular, it’s a so-called ballistic conductor, meaning that electrons flow through it at high speed, like photons through a vacuum, with virtually no collisions with the atoms in the crystal. This makes it a potentially outstanding conductor for wires and other elements in nanoscale electronics.

Defects or irregularities in the graphene crystal, however, can cause the electrons to bounce back or scatter, the equivalent of electrical resistance, so one key issue is just what sort of defects cause scattering, and how much? To answer this, the NIST-Georgia Tech team grew layers of graphene on wafers of silicon carbide crystals and mapped the sheets with a custom-built scanning tunneling microscope (STM) in the NIST Center for Nanoscale Science and Technology that can measure both physical surface features and the interference patterns caused by electrons scattering in the crystal. (Graphene on silicon carbide is a leading candidate for graphene-based nanoelectronics.)

The results are counter-intuitive. Irregularities in the underlying silicon carbide cause bumps and dips in the graphene sheet that lies over it rather like a blanket on a lumpy bed, but these relatively large bumps have only a minor effect on the electron’s passage. In contrast, missing carbon atoms in the crystal lattice cause strong scattering, the interference patterns rippling around them like waves hitting the piles of a pier. From a detailed analysis of these interference patterns, the team verified that electrons in the graphene sheet behave like photons, even at the nanometer scale.
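The photon-like behavior the team verified corresponds to graphene's well-known linear ("massless Dirac") dispersion relation, sketched here in standard notation; v_F is the Fermi velocity, roughly 10^6 m/s:

```latex
E = \hbar v_F \lvert k \rvert = \frac{h v_F}{\lambda}
\qquad \text{(compare } E = \frac{hc}{\lambda} \text{ for photons)}
```

Energy is inversely proportional to wavelength, exactly the relation the interference-pattern analysis confirmed, with v_F playing the role the speed of light plays for photons.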

This work is supported in part by the Office of Naval Research, National Science Foundation and Intel Research.

*G.M. Rutter, J.N. Crain, N.P. Guisinger, T. Li, P.N. First and J.A. Stroscio. Scattering and interference in epitaxial graphene. Science 13 July 2007.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763

NIST Begins Technical Study of S.C. Warehouse Fire

The National Institute of Standards and Technology (NIST) has announced that it is beginning a technical study of the June 18, 2007, fire at the Sofa Super Store furniture store/warehouse in Charleston, S.C., that killed nine firefighters.

Within 36 hours of the fire, NIST had a reconnaissance team of fire experts on site to assess the situation and gather data about the fire as well as the subsequent collapse of the Sofa Super Store building. Based on that reconnaissance, NIST has determined that additional study is warranted.

The NIST team of fire experts has also participated, at the invitation of the National Institute for Occupational Safety and Health (NIOSH), in interviews of the Charleston Fire Department personnel who witnessed the fire. The NIST team will use the interview data to establish a timeline for reconstructing the fire in a computer simulation, both for NIOSH’s use and for further NIST research.

Other activities for the team could include: 

  • determining the conditions in the building prior to the initiating event (geometry, materials of construction and contents; location and conditions of doors, windows and ventilation; installed fire protection systems);
  • determining why and how the fire spread so quickly and, if necessary based on further evaluation, why and how the building collapsed so quickly; and
  • using computer simulations to study the impact on human survivability (referred to by researchers as “tenability”) in the Charleston fire if a sprinkler system had been installed.


A fire such as the one that occurred in Charleston raises a number of questions that, if answered, might serve as the basis for: 

  • improvements in how buildings are designed, constructed, maintained and used;
  • improved tools and guidance for the fire service and building owners;
  • recommended revisions to current model codes, standards and practices; and
  • improved public safety.


For example, NIST’s study of a 1999 townhouse fire in Washington, D.C., resulted in a computer model that determined the cause of the rapid burst of flames up a stairway that claimed the lives of two firefighters. Based on the findings, District of Columbia Fire Department operating procedures were changed, and the computer simulation is now used as a training tool nationwide.

NIST has more than 30 years of experience studying and investigating structural failures, including those as a result of fire, earthquakes and wind. For more information on these efforts, go to www.nist.gov/public_affairs/factsheet/bfrlinvestigations.htm. 

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025

Evaluations Aim to Advance Translation Technology

Wartime military patrols and civilian encounters can be especially dangerous if neither group understands the other’s language. To help American forces secure critical information and communicate with the local population, National Institute of Standards and Technology (NIST) researchers are evaluating prototype, real-time, two-way translation systems for the Defense Advanced Research Projects Agency (DARPA).

The DARPA program called TRANSTAC (Spoken Language Communication and Translation System for Tactical Use) currently focuses on English and Iraqi Arabic. From July 16 to 20, NIST ran a series of laboratory and outdoor evaluation tests on prototype systems with English-speaking U.S. Marines and Iraqi Arabic speakers at its Gaithersburg, Md., campus. In each of the exercises, NIST measured system capabilities in speech recognition, machine translation, noise robustness, user interface design and efficient performance on limited hardware platforms.

“Effective two-way translation devices would represent a major advance in field translators,” according to Craig Schlenoff, leader of the NIST evaluation project. “Although American forces in Iraq currently have the use of phrase-based translators, the devices can only translate English into pre-recorded Arabic phrases. They cannot translate Iraqi Arabic into English,” he said.

During the NIST laboratory and field tests the Marines and Iraqi Arabic speakers acted out 10 different scenarios—ranging from traffic checkpoints to neighborhood surveys—that required verbal communication.  Individuals in the laboratory tests looked directly at each other during the question and answer sessions.  Although their audible conversation was recorded on a laptop, neither party could see the screen.  Iraqi Arabic speakers, who understood English, also wore earphones that blocked out the English language query and, instead, relayed only the system’s Arabic interpretation of the question.  Background sounds were tightly controlled, so that the systems could be evaluated in a predictable environment.

The outdoor evaluations, which included background noises such as other speakers, generators, opening garage doors, running vehicles and radio broadcasts, simulated more realistic conditions. The military personnel also carried the translator devices in backpacks or in another hands-free manner, approximating future hardware developments that should provide American forces with small, even palm-sized, translators that do not require attention or interfere with their ability to stay alert and vigilant.

“NIST evaluations provide DARPA with statistically significant data that shows the relative improvements of the TRANSTAC systems over time,” said Schlenoff. “Armed with this information, DARPA is better able to make program decisions about which technologies are showing the most promise.”
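Relative-improvement scores like these are built on quantitative error metrics. The article does not say which metrics NIST used for TRANSTAC, but word error rate (WER), a standard measure in speech-recognition and translation evaluation, gives the flavor; the sketch below is illustrative only, not the TRANSTAC scoring method.

```python
# Sketch: word error rate (WER) -- edits needed to turn a system's output
# into the reference transcript, divided by the reference length.
# Illustrative example; not NIST's actual TRANSTAC metric.

def word_error_rate(reference, hypothesis):
    """Word-level edit distance (substitutions + insertions + deletions)
    over the number of reference words, via dynamic programming."""
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i                      # delete all remaining ref words
    for j in range(len(hyp) + 1):
        d[0][j] = j                      # insert all remaining hyp words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # match / substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution ("the" -> "a") and one deletion ("please") in 4 words:
wer = word_error_rate("open the gate please", "open a gate")
```

Tracking such a metric across successive evaluation rounds is what lets a program office see whether the prototype systems are actually improving.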

Once the technology is fully developed, DARPA hopes to be able to produce an automatic translator system for a new language within 90 days of receiving a request for that language.

Media Contact: John Blair, john.blair@nist.gov, 301-975-4261

Tailoring Computer Security for Industrial Controls

Electric power, water, communications, waste treatment—we call the facilities that supply these services our “critical” infrastructures because they truly are critical to the health, safety and quality of everyday life. And because they are so critical, the computer systems used to control these infrastructure operations need to be rigorously protected against security breaches.

The National Institute of Standards and Technology (NIST) is soliciting public comments on a proposed expansion of its Special Publication 800-53 that adds specific requirements and guidance for protecting industrial control systems managed by federal agencies or their contractors. Produced through a partnership of NIST information technology and manufacturing engineering experts, the proposed text takes the form of two appendices that describe how to tailor a computer security plan to an industrial control environment.

For example, certain general information security requirements, such as screen locks that force the user to reenter a password after a period of inactivity, are impractical in some industrial control settings. Instead, compensating controls, such as rigorous physical security, may need to be implemented to protect the system from unauthorized access at the console. Requirements for remote access monitoring or cryptography, as well as for testing of updates or patches, also may need to be handled differently for industrial control systems than for general information systems.

“We produced these security strategies for federal agency use,” said NIST project leader Stuart Katzke, “however we hope that the private-sector industrial control community will consider adopting them as well.”

Under the Federal Information Security Management Act of 2002, NIST is tasked with producing computer security standards and guidelines to help federal agencies effectively protect and manage their information technology systems.

The draft appendices are available at: http://csrc.nist.gov/sec-cert/ics/draft-ics-interpretation_SP800-53.html.

NIST will accept comments on the text through Aug. 31, 2007. Comments should be forwarded to the NIST Computer Security Division, Information Technology Laboratory, 100 Bureau Drive, Mail Stop 8930, Gaithersburg, MD 20899-8930, or submitted via email to sec-ics@nist.gov.

As part of this project, NIST is planning a workshop in Knoxville, Tenn., on Aug. 16 and 17.  Information about the workshop can be obtained at: http://csrc.nist.gov/sec-cert/ics/events.html.

Media Contact: Gail Porter, gail.porter@nist.gov, 301-975-3392

Quicklinks

New NIST Measurement Office Focuses on Innovation

National Institute of Standards and Technology (NIST) Director William Jeffrey has named Clare Allocca as the chief of NIST’s new United States Measurement System (USMS) Office. The creation of the office marks the start of the second phase of NIST’s USMS effort to ensure that the nation’s measurement infrastructure—a large, diverse collection of private and public-sector organizations—can sustain U.S. innovation at a world-leading pace.

Phase I of the USMS effort culminated in February 2007 with a wide-ranging NIST assessment of the state of the nation’s measurement system and its impact on innovation. The USMS report, including contributions from more than 1,000 people in industry, academia and government, surveyed measurement needs across 11 industrial sectors and technology areas to identify more than 700 measurement-related barriers to innovation.

NIST’s USMS Office will take the lead in the implementation of Phase II, which includes continual collection of measurement needs of U.S. industry, government and the scientific community; the periodic assessment of the health of the U.S. Measurement System, through identification of priority areas needing improvements; and the facilitation of these improvements via dissemination of needs to appropriate measurement providers, including NIST.

Allocca brings a wealth of experience to her new role as USMS Office Chief. Previously, she served as scientific advisor to the director of NIST’s Materials Science and Engineering Laboratory. Her NIST career also includes positions in the agency’s Industrial Liaison Office, Program Office, Director’s Office and the Advanced Technology Program. Before joining NIST, she was a senior materials engineer for Pratt & Whitney.

“NIST will lead the USMS effort,” says Allocca, “but our success in addressing priority measurement needs will really depend on close collaboration with other measurement providers, standards development organizations and many others.”

For more on the USMS, including access to the assessment report, go to www.nist.gov/usms.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763
