In This Issue...
Piece of Cake: Arrays of Long Nanotubes May Help Measure Terahertz Laser Power
Terahertz radiation can penetrate numerous materials—plastic, clothing, paper and some biological tissues—making it an attractive candidate for applications such as concealed weapons detection, package inspection and imaging skin tumors. However, to date there is no standard method for measuring the absolute output power of terahertz lasers, one source of this type of radiation. Now, researchers at the National Institute of Standards and Technology (NIST) have found that dense arrays of extra-long carbon nanotubes absorb nearly all light of long wavelengths, and thus are promising coatings for prototype detectors intended to measure terahertz laser power.*
The research is part of NIST's effort to develop the first U.S. reference standards** for calibrating lasers that operate in the terahertz range, from the far infrared at wavelengths of 100 micrometers to the edge of the microwave band at 1 millimeter.
"There is no measurement traceability for absolute power for terahertz laser sources," NIST project leader John Lehman says. "We have customers asking for the calibrations. This coating looks viable for terahertz laser power detectors."
The coating, called a VANTA (vertically aligned carbon nanotube array), has several desirable properties. Most obviously, it is easy to handle. The nanotubes are tens of micrometers to over a millimeter long, so a dense layer is visible without a microscope. A chunk of VANTA can be cut, lifted, and carried like a piece of cake, making it easy to transfer from a silicon surface where the tubes are grown to a laser power detector.
Most importantly, the coating is very dark. The NIST team evaluated three VANTA samples with average nanotube lengths of 40 micrometers, 150 micrometers and 1.5 millimeters (mm), and found that longer tubes reflect less light. The 1.5 mm version reflects almost no light: just 1 percent at a wavelength of 394 micrometers. This result, the first-ever evaluation of a VANTA's reflectance at that terahertz wavelength, indicates that virtually all arriving laser light is absorbed, which would enable highly accurate measurements of laser power.
The 1.5 mm VANTA absorbs more light than comparable coatings such as gold black, but more work is needed to calculate uncertainties and determine effects of factors such as light angle. The project extends NIST's long history in laser power measurements and Lehman's recent advances in ultradark nanotube coatings.***
VANTAs also have desirable thermal properties. NIST researchers found that the material absorbs and releases heat quickly compared with other black coatings, which will make the detectors more responsive and quicker to produce signals. Without this fast heat transfer, a coating thick enough to absorb long wavelengths of light would not efficiently conduct heat to the detector.
In developing the capability for terahertz laser radiometry, NIST is building a terahertz laser designed for routine measurements and a detector called a thermopile to measure the laser's power. This simple detector design produces a voltage when heat is applied to a junction of two dissimilar metals. NIST researchers used the VANTA to coat a prototype thermopile. Further research is planned to design detectors that might be used as reference standards.
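The thermopile's operating principle described above, a voltage generated when heat reaches junctions of dissimilar metals (the Seebeck effect), can be sketched numerically. The junction count, Seebeck coefficient and temperature rise below are illustrative values, not parameters of NIST's actual detector.

```python
# Sketch of the Seebeck-effect relation behind a thermopile detector:
# V = n * S * dT, where each of n junctions in series contributes
# S volts per kelvin of temperature difference between its hot and
# cold sides. All numbers here are illustrative assumptions.

def thermopile_voltage(n_junctions, seebeck_v_per_k, delta_t_k):
    """Open-circuit voltage of an idealized thermopile."""
    return n_junctions * seebeck_v_per_k * delta_t_k

# Hypothetical detector: 50 junctions, 40 microvolts/K per junction
# (a typical order of magnitude for metal thermocouple pairs), and a
# 0.5 K temperature rise from absorbed laser power.
v = thermopile_voltage(50, 40e-6, 0.5)
print(f"{v * 1e3:.1f} mV")  # 1.0 mV
```

A darker, faster-responding absorber like the VANTA raises the temperature difference at the junctions more efficiently, which is why the coating's optical and thermal properties both matter for this detector design.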
* J.H. Lehman, B. Lee and E.N. Grossman. Far infrared thermal detectors for radiometry using a carbon nanotube array. Applied Optics. Posted online July 18, 2011.
Nanomechanical measurements (model system and microimage of typical specimen): a) thin rigid film on elastic substrate; b) initial strain induces surface wrinkles parallel to the stress; c) additional strain induces a regular pattern of cracks in the film; d) typical specimen imaged with an optical profilometer (280 × 210 micrometers).
Credit: Chung, Lee/NIST
Reverse-osmosis membranes, explains NIST researcher Chris Stafford, are an interesting challenge for the materials scientist. The membranes, used in water purification systems, consist of a polyamide film no more than 200 nanometers thick backed by a thicker, porous support layer. Water holding dissolved salts or other contaminants is forced against one side of the membrane at substantial pressures, up to about a thousand psi (roughly 7 megapascals), and comes out the other side leaving most of the impurities behind. The mechanical integrity of the membrane is obviously essential; it can't tear or develop pinhole leaks under the pressure. But engineers lacked a good way to measure the strength and breaking point, under stress, of these extremely thin films.
The NIST technique builds on earlier work by the team demonstrating that you can reliably determine Young's modulus, a measure of stiffness or elasticity, for thin and ultrathin films by bonding the film to a piece of silicone rubber and then carefully stretching it in one direction. The film develops a regularly spaced pattern of wrinkles (try it with a piece of plastic wrap), and the spacing of the wrinkles, the amount of stretch and some math give you the modulus. In the new work, the researchers essentially pull harder until the film starts developing minute cracks crosswise to the tension. These, too, turn out to occur in regular patterns, and the spacing can be analyzed to determine both the fracture strength and the onset fracture strain, or failure point, of the film.
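The "some math" mentioned above is, in standard wrinkling metrology, a simple relation between the wrinkle wavelength and the moduli of the film and substrate. The sketch below uses the widely cited buckling relation for a stiff film on a compliant substrate; the specific numbers (substrate modulus, Poisson's ratios, film thickness, measured wavelength) are illustrative assumptions, not values from the article.

```python
import math

def film_modulus(wavelength, thickness, e_substrate,
                 nu_film=0.35, nu_substrate=0.5):
    """Young's modulus of a thin film from its wrinkle wavelength.

    Standard relation for a stiff film on a compliant substrate:
        E_f / (1 - nu_f**2) =
            3 * E_s / (1 - nu_s**2) * (wavelength / (2*pi*thickness))**3
    All inputs in SI units (meters, pascals).
    """
    plane_strain_es = e_substrate / (1.0 - nu_substrate**2)
    ratio = wavelength / (2.0 * math.pi * thickness)
    return 3.0 * plane_strain_es * ratio**3 * (1.0 - nu_film**2)

# Illustrative numbers: a 200 nm film on silicone rubber (E ~ 1.8 MPa)
# showing wrinkles with an 8.5-micrometer wavelength.
e_f = film_modulus(wavelength=8.5e-6, thickness=200e-9, e_substrate=1.8e6)
print(f"Film modulus ~ {e_f / 1e9:.1f} GPa")  # ~2 GPa, typical of a glassy polymer
```

Because the wavelength enters cubed, even a modest change in wrinkle spacing signals a large change in film stiffness, which is what makes the method sensitive enough to track degradation over time.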
Applying their technique to study the effect of chlorine on reverse-osmosis membranes, the team uncovered a puzzle. Chlorine in the water is known to cause a progressive deterioration in membrane performance, generally thought to be the result of prolonged chemical attack by the chlorine. Not so, according to the NIST team. "Chemically the chlorine attack is pretty quick," says Stafford. Spectroscopic chemical analysis showed that all the chemical damage from chlorine exposure happens in the first few hours. Tests using the wrinkle-crack method, however, show that the mechanical properties degrade continuously—the material becoming more and more stiff, brittle and weak—up to the longest duration tested, 10 days. "It may be an aging effect in polymers," says Stafford. "We're continuing to study that to figure out what's going on in there, because it's a real measurement challenge to get in on that length scale to follow the structure over time."
The project is part of a broader NIST program to study materials issues related to sustainable technologies like water purification, but the research team notes that the wrinkle-crack method itself would be broadly applicable to mechanical studies of almost any nanoscale thin film in fields as diverse as artificial skin, flexible electronics, thin-film sensors, fuel cells and photovoltaics.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763
The electromagnetic force has gotten a little stronger, gravity a little weaker, and the size of the smallest "quantum" of energy is now known a little better. The National Institute of Standards and Technology (NIST) has posted* the latest internationally recommended values of the fundamental constants of nature.
The constants, which range from the relatively famous (the speed of light) to the fairly obscure (the Wien frequency displacement law constant), are adjusted every four years in response to the latest scientific measurements and advances. These latest values arrive on the verge of a worldwide vote this fall on a plan to redefine** the most basic units in the International System of Units (SI), such as the kilogram and ampere, exclusively in terms of the fundamental constants.
The values are determined by the Committee on Data for Science and Technology (CODATA) Task Group on Fundamental Constants,*** an international group that includes NIST members. The adjusted values reflect some significant scientific developments over the last four years.
Often the biggest news in a fundamental constant value is a reduced uncertainty—scientists know the value better. The uncertainty in the value of the fine-structure constant alpha (α = 7.297 352 5698 × 10⁻³), which dictates the strength of the electromagnetic force, has been slashed in half to 0.3 parts per billion (ppb). Since alpha can be measured in a uniquely broad range of phenomena from the recoil of atoms to the magnetic properties of electrons, the consistency of the measurements acts as a barometer of scientists' general understanding of physics. Alpha will also be a critical constant after a redefinition of the SI: it will remain an experimentally determined constant, while quite a few others' values will be fixed to define the basic measurement units.
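The internal consistency of the CODATA values can be illustrated directly: alpha follows from other constants via α = e²μ₀c/2h. The elementary-charge figure below is the CODATA 2010 value, quoted here for illustration; c and μ₀ were exact by definition in the pre-2019 SI.

```python
import math

# CODATA 2010 values. The Planck constant appears later in this
# article; the elementary charge is the CODATA 2010 figure.
c = 299792458.0              # speed of light, m/s (exact)
mu0 = 4.0 * math.pi * 1e-7   # magnetic constant, N/A^2 (exact, pre-2019 SI)
h = 6.62606957e-34           # Planck constant, J s
e = 1.602176565e-19          # elementary charge, C

# alpha = e^2 / (4*pi*eps0*hbar*c), which with eps0 = 1/(mu0*c^2)
# and hbar = h/(2*pi) simplifies to e^2 * mu0 * c / (2*h).
alpha = e**2 * mu0 * c / (2.0 * h)
print(f"alpha = {alpha:.10e}")  # agrees with 7.2973525698e-3
```

That the computed value reproduces the recommended α is by construction: the CODATA adjustment is a global least-squares fit that keeps the full set of constants mutually consistent.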
Also improved is the Planck constant h, which defines the size of the smallest possible "quantum" (packet) of energy, and is central to efforts to redefine the SI unit of mass. The latest value of h (6.626 069 57 × 10⁻³⁴ joule seconds) takes into account a measurement of the number of atoms in a highly enriched silicon sphere. That value currently disagrees with the other fundamental method for determining h, known as the watt balance.**** Even so, when all the values are combined, the overall uncertainty of h (44 ppb) is smaller than in 2006, and the values from the two techniques are getting closer to each other.
The 2010 CODATA values incorporate two new experimental measurements of G, the Newtonian constant of gravitation, which dictates the strength of gravity. The latest value of G (6.673 84 × 10⁻¹¹ m³ kg⁻¹ s⁻²) is about 66 parts per million smaller than the 2006 value. Other adjusted values include the radius of the proton and constants related to atoms and gases, such as the Rydberg and molar gas constants.
The CODATA task group is preparing a full report on the 2010 adjustments (for now, there is a brief overview*****), and the report will include recommendations for future measurements. A plan to adopt a fully constant-based SI, being voted upon this October by the General Conference on Weights and Measures, is contingent upon the values of the fundamental constants such as h reaching certain levels of precision and accuracy that will require further measurement advances in the coming years.
Media Contact: Ben Stein, email@example.com, 301-975-3097
With increasing dependency on information systems and advances in cloud computing, the smart grid and mobile computing, maintaining the confidentiality and integrity of citizens' personally identifiable information is a growing challenge. A new draft document from the National Institute of Standards and Technology (NIST) addresses that challenge by adding privacy controls to the catalog of security controls used to protect federal information and information systems.
Personally identifiable information (PII) is information that is unique to an individual, such as a Social Security number, birth information, fingerprints and other biometrics. In the wrong hands, PII can be used in identity theft, fraud or other criminal activities. Today, more than ever, citizens want assurance that their personal information is protected as it is processed, stored and transmitted across computing clouds and mobile devices in the federal government and in other areas such as health care and banking. Protecting PII is a key goal of the federal government.
"Strong normalized privacy controls are an essential component in the ongoing effort to build measurable privacy compliance," said NIST Senior Internet Policy Advisor Ari Schwartz. "Certainty in controls and measures can help promote privacy, trust and greater confidence in new standards."
The new document, Privacy Control Catalog, will become Appendix J of Security Controls for Federal Information Systems and Organizations (NIST Special Publication 800-53, Revision 4). One of the foundational Federal Information Security Management Act (FISMA) documents, SP 800-53 is being updated to Revision 4 in December 2011. SP 800-53 is also one of the Joint Task Force Transformation Initiative documents that NIST produces with the Department of Defense and the Intelligence Community.
"Privacy and security controls in federal information systems are complementary and mutually reinforcing in trying to achieve the privacy and security objectives of organizations," said NIST Fellow Ron Ross, project leader of the FISMA Implementation Project and Joint Task Force.
Incorporating privacy controls into SP 800-53 and taking advantage of established security controls to provide a solid foundation for information security helps to ensure that privacy requirements will be satisfied in a comprehensive, cost-effective, and risk-based manner.
The new privacy appendix:
Provides a structured set of privacy controls, based on international standards and best practices, that help organizations enforce requirements deriving from federal privacy legislation, policies, regulations, directives, standards and guidance;
Establishes a linkage and relationship between privacy and security controls for purposes of enforcing respective privacy and security requirements, which may overlap in concept and in implementation within federal information systems and organizations;
Demonstrates the applicability of the NIST Risk Management Framework in the selection, implementation, assessment and monitoring of privacy controls deployed in federal information systems and organizations; and
Promotes closer cooperation between privacy and security officials within the federal government to help achieve the objectives of senior leaders/executives in enforcing the requirements in federal privacy legislation, policies, regulations, directives, standards and guidance.
Due to the special nature of the material in Appendix J, it is being vetted separately from other changes to the main document. The public comment period for this appendix runs through September 2, 2011. Comments should be sent to firstname.lastname@example.org. The publication may be found at http://csrc.nist.gov/publications/PubsDrafts.html#SP-800-53-Appendix%20J
In addition to the basic privacy controls in Appendix J, NIST plans to develop assessment procedures to allow organizations to evaluate the effectiveness of the controls on an ongoing basis. Standardized privacy controls and assessment procedures will provide a more disciplined and structured approach for satisfying federal privacy requirements and demonstrating compliance to those requirements, Ross said.
Media Contact: Evelyn Brown, email@example.com, 301-975-5661
Researchers at the National Institute of Standards and Technology (NIST) have released for public comment updated specifications for the Security Content Automation Protocol (SCAP), which helps organizations find and manage computer-system vulnerabilities more effectively by standardizing the way vulnerabilities are identified, prioritized and reported.
SCAP unites and organizes a collection of computer security specifications and reference data to support automated security programs that check vulnerabilities in information systems, such as configuration errors, missing software "patches," misapplied security settings and many others. SCAP-based security tools are particularly valuable for securing large, complex information systems and organizations with many distributed computing systems.
System operations and security professionals use SCAP-based software products to determine a system's security status, particularly its software flaws and security configuration settings, in an efficient, accurate way. For example, SCAP enables automated assessment of the software patches present on a system and identifies the potential security risk to an organization from an unpatched vulnerability. Using SCAP, information system administrators can address critical vulnerabilities, mitigating the risk of attack.
In The Technical Specification for the Security Content Automation Protocol (SCAP): SCAP Version 1.2 (NIST Special Publication 800-126 Revision 2), some underlying specifications have been enhanced in response to requests from SCAP content authors and product developers. SCAP 1.2 also incorporates three new underlying specifications that add asset reporting, asset identification and a digital trust model to help ensure the integrity of SCAP data itself.
The digital trust model in SP 800-126 Rev. 2 is described in the draft NIST Interagency Report 7802: Trust Model for Security Automation Data 1.0. The model has applications beyond the security automation environment. It permits users to establish the integrity, authentication and traceability of security automation XML data using a 21st-century version of a 17th-century king's wax seal on a scroll. The trust model is based on existing specifications from the World Wide Web Consortium and describes features that will enhance the trustworthiness of information.
The U.S. federal government employs SCAP to support security activities and initiatives. Academia and private industry, such as finance, manufacturing and health care, also use the protocol as it provides a standardized situational awareness view of IT systems that can be used by system administrators, security officers and executives to make security decisions.
SCAP is expected to evolve and expand to support the growing needs to define and measure effective security controls, assess and monitor ongoing aspects of information security, and successfully manage systems in accordance with risk management frameworks such as NIST SP 800-53, Department of Defense Instruction 8500.2, and the Payment Card Industry (PCI) framework.
Authors request comments on both publications by August 1, 2011. The Technical Specification for the Security Content Automation Protocol (SCAP): SCAP Version 1.2 (NIST Draft Special Publication 800-126 Revision 2) can be found at http://csrc.nist.gov/publications/drafts/800-126r2/draft-SP800-126r2.pdf. Comments should be sent to firstname.lastname@example.org with "Comments SP 800-126" in the subject line. Trust Model for Security Automation Data 1.0 can be found at http://csrc.nist.gov/publications/drafts/nistir-7802/Draft-nistir-7802.pdf. Comments should be sent to email@example.com with "Comments IR 7802" in the subject line.
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
The National Institute of Standards and Technology (NIST) today* announced that manufacturing industry executive Michael F. Molnar has been appointed to be the agency’s first-ever Chief Manufacturing Officer.
NIST Chief Manufacturing Officer Michael F. Molnar
Credit: Courtesy Cummins Inc.
The manufacturing sector is critical to the U.S. economy, and the Obama administration is committed to building domestic manufacturing capabilities to create the new products, new industries and new jobs of the future. NIST is particularly well-positioned to support this goal because of its unique mission to work closely with industry. This new position will leverage NIST’s strong relationships with industry to accelerate innovation that will create 21st-century manufacturing jobs and enhance our global competitiveness. As part of this effort, the position will support the broader Advanced Manufacturing Partnership recently launched by President Obama that brings industry, universities and the federal government together to invest in emerging technologies.
“We look forward to having Mike join the NIST team,” said Under Secretary of Commerce for Standards and Technology and NIST Director Patrick Gallagher. “His background in manufacturing research and development, collaborative engineering, and sustainable products and processes, combined with his policy experience, make him uniquely suited for this position.”
As Chief Manufacturing Officer, Molnar will be responsible for planning and coordination of the Institute’s broad array of manufacturing research and services programs. He will serve as NIST’s central point of contact with the White House, the Department of Commerce and other agencies on technical and policy issues related to manufacturing.
Molnar has extensive industrial experience, with leadership roles in manufacturing technology, advanced manufacturing engineering, metrology and quality systems. He currently serves as Director of Environmental Policy and Sustainable Development at the Columbus, Ind., headquarters of Cummins Inc. Cummins is a $14 billion international company that designs and manufactures commercial engines and power generation systems.
Molnar has served as a federal fellow in the White House Office of Science and Technology Policy, and was elected as a fellow of both the American Society of Mechanical Engineers and the Society of Manufacturing Engineers. He is a licensed professional engineer, a certified manufacturing engineer and a certified energy manager.
Molnar received a Master of Business Administration from the University of Notre Dame in Indiana and both a Master of Science in manufacturing systems engineering and a Bachelor of Science in mechanical engineering from the University of Wisconsin-Madison. He is an active member of professional societies, consortia and volunteer organizations.
Molnar will begin working at NIST on August 29, 2011.
* Originally posted on July 8, 2011.
Media Contact: Jennifer Huergo, email@example.com, 301-975-6343