In This Issue...
Nanowire-based Sensors Offer Improved Detection of Volatile Organic Compounds
A team of researchers from the National Institute of Standards and Technology (NIST), George Mason University and the University of Maryland has made nano-sized sensors that detect volatile organic compounds—harmful pollutants released from paints, cleaners, pesticides and other products. The new sensors offer several advantages over today's commercial gas sensors, including low-power, room-temperature operation and the ability to detect one or several compounds over a wide range of concentrations.
The recently published work* is proof of concept for a gas sensor made of a single nanowire and metal oxide nanoclusters chosen to react to a specific organic compound. This work is the most recent of several efforts at NIST that take advantage of the unique properties of nanowires and metal oxide elements for sensing dangerous substances.
Modern commercial gas sensors are made of thin, conductive films of metal oxides. When a volatile organic compound like benzene interacts with titanium dioxide, for example, a reaction alters the current running through the film, triggering an alarm. While thin-film sensors are effective, many must operate at temperatures of 200° C (392° F) or higher. Frequent heating can degrade the materials that make up the films and contacts, causing reliability problems. In addition, most thin-film sensors work within a narrow range: one might catch a small amount of toluene in the air, but fail to sniff out a massive release of the gas. The range of the new nanowire sensors runs from just 50 parts per billion up to 1 part per 100, or 1 percent of the air in a room.
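For a sense of how wide that detection range is, a quick back-of-the-envelope conversion (illustrative unit arithmetic only, not a calculation from the paper) puts both ends of the range in the same unit:

```python
import math

# Illustrative arithmetic (not from the paper): expressing the sensors'
# reported detection range in a single unit to show how wide it is.
PPB_PER_PERCENT = 1e7   # 1 percent = 1 part per 100 = 10,000,000 parts per billion

low_ppb = 50                     # lower detection limit: 50 parts per billion
high_ppb = 1 * PPB_PER_PERCENT   # upper limit: 1 percent of the air in a room

dynamic_range = high_ppb / low_ppb
orders = math.log10(dynamic_range)

print(f"dynamic range: {dynamic_range:,.0f}x (~{orders:.1f} orders of magnitude)")
```

In other words, the upper limit is about 200,000 times the lower one—more than five orders of magnitude, far beyond the narrow window of a typical thin-film sensor.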
These new sensors, built using the same fabrication processes that are commonly used for silicon computer chips, operate using the same basic principle, but on a much smaller scale: the gallium nitride wires are less than 500 nanometers across and less than 10 micrometers in length. Despite their microscopic size, the nanowires and titanium dioxide nanoclusters they're coated with have a high surface-to-volume ratio that makes them exquisitely sensitive.
"The electrical current flowing through our nanosensors is in the microamps range, while traditional sensors require milliamps," explains NIST's Abhishek Motayed. "So we're sensing with a lot less power and energy. The nanosensors also offer greater reliability and smaller size. They're so small that you can put them anywhere." Ultraviolet light, rather than heat, promotes the titanium dioxide to react in the presence of a volatile organic compound.
Further, each nanowire is a defect-free single crystal, rather than the conglomeration of crystal grains found in thin-film sensors, making it less prone to degradation. In reliability tests over the last year, the nano-sized sensors have not experienced failures. While the team's current experimental sensors are tuned to detect benzene as well as the similar volatile organic compounds toluene, ethylbenzene and xylene, their goal is to build a device that includes an array of nanowires and various metal oxide nanoclusters for analyzing mixtures of compounds. They plan to collaborate with other NIST teams to combine their ultraviolet light approach with heat-induced nanowire sensing technologies.
The portion of this work conducted at George Mason University was funded by the National Science Foundation.
* G.S. Aluri, A. Motayed, A.V. Davydov, V.P. Oleshko, K.A. Bertness, N.A. Sanford and M.V. Rao. Highly selective GaN-nanowire/TiO2-nanocluster hybrid sensors for detection of benzene and related environment pollutants. Nanotechnology 22, 295503 (2011). doi: 10.1088/0957-4484/22/29/295503.
Media Contact: Michael Baum, email@example.com, 301-975-2763
Branch Offices: New Family of Gold-Based Nanoparticles Could Serve as Biomedical 'Testbed'
Gold nanoparticles are becoming the … well … gold standard for medical-use nanoparticles. A new paper* by researchers from the National Institute of Standards and Technology (NIST) and the National Cancer Institute's Nanotechnology Characterization Laboratory (NCL) proposes not only a sort of gold nanoparticle "testbed" to explore how the tiny particles behave in biological systems, but also a paradigm for how to characterize nanoparticle formulations to determine just what you're working with.
Prospective uses of gold nanoparticles, says NIST chemist Vince Hackley, include high-precision drug-delivery systems and diagnostic image enhancers. Gold is nontoxic and can be fashioned into particles in a range of sizes and shapes. By itself, gold doesn't do much biologically, but it can be "functionalized" by attaching, for instance, protein-based drugs along with targeting molecules that cluster preferentially around cancer cells. The nanoparticles are generally coated as well, to prevent them from clumping together and to avoid rapid clearance by the body's immune system.
NCL's Anil Patri notes that the coating composition, density and stability have a profound impact on the nanomaterial safety, biocompatibility (how well the nanoparticles distribute in the body), and efficacy of the delivery system. "Understanding these parameters through thorough characterization would enable the research community to design and develop better nanomaterials," he says.
To facilitate such studies, the NIST/NCL team set out to create a nanoparticle testbed—a uniform, controllable core-shell nanoparticle that could be made-to-order with precise shape and size, and to which could be attached nearly any potentially useful functionality. Researchers then could study how controlled variations fared in a biological system.
Their trial system is based on regularly shaped branching molecules called dendrons, a term derived from the Greek word for "tree." Dendron chemistry is fairly new, dating from the 1980s. They're excellent for this use, says NIST researcher Tae Joon Cho, because the individual dendrons are always the same size, unlike polymers, and can readily be modified to carry "payload" molecules. At the same time, the tip of the structure—the "tree's" trunk—is designed to bond easily to the surface of a gold nanoparticle.
The team made an exhaustive set of measurements so they could thoroughly describe their custom-made dendron-coated nanoparticles. "There aren't a lot of protocols around for characterizing these materials—their physical and chemical properties, stability, et cetera," Hackley says, "so, one of the things that came out of the project is a basic series of measurement protocols that we can apply to any kind of gold-based nanoparticle."
Any single measurement technique, he says, is probably inadequate to describe a batch of nanoparticles, because it likely will be insensitive to some size ranges or confused by other factors—particularly if the particles are in a biological fluid.
The new NIST/NCL paper provides the beginnings of a catalog of analysis techniques for getting a detailed lowdown on nanoparticles. These techniques include nuclear magnetic resonance spectroscopy, matrix-assisted laser desorption/ionization mass spectrometry, dynamic light scattering, ultraviolet/visible spectroscopy and X-ray photoelectron spectroscopy. The dendron-coated nanoparticles also were tested for stability under "biologically relevant" conditions of temperature, acidity and some recognized forms of chemical attack that would take place in the bloodstream. In vitro biological tests are pending.
The work was funded in part by the National Cancer Institute, National Institutes of Health.
* T.J. Cho, R.A. Zangmeister, R.I. MacCuspie, A.K. Patri and V.A. Hackley. Newkome-type dendron-stabilized gold nanoparticles: synthesis, reactivity, and stability. Chem. Mater. 2011, 23, 2665–2676. doi: 10.1021/cm200591h.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763
Final Version of Industrial Control Systems Security Guide Published
The National Institute of Standards and Technology (NIST) has issued the final version of its Guide to Industrial Control Systems (ICS) Security (SP 800-82),* intended to help pipeline operators, power producers, manufacturers, air traffic control centers and other managers of critical infrastructures to secure their systems while addressing their unique performance, reliability, and safety requirements.
Finalized after three rounds of public review and comment, the guide is directed specifically to federally owned or operated industrial control systems (ICS), including those run by private contractors on behalf of the federal government. Examples include mail handling operations, air traffic control towers, and some electricity generation and transmission facilities and weather observation systems. However, the guide's potential audience is far larger and more diverse than the federal government, since about 90 percent of the nation's critical infrastructure is privately owned.
The guide responds to responsibilities assigned to NIST under the Federal Information Security Management Act (FISMA). The law directs NIST to develop information security standards and guidelines for non-national security federal information systems. While these FISMA-related specifications are not mandatory for the private sector or state and local governments, many businesses and other organizations have adopted the NIST-developed standards and guidelines. Drafts of the new document have been downloaded more than 1,000,000 times, and the guide already is referenced in industry-specific security publications.
Industrial control systems include supervisory control and data acquisition (SCADA) systems, distributed control systems and programmable logic controllers. The scope of facilities and equipment encompassed by these technologies ranges from broadly dispersed operations, such as natural gas pipelines and water distribution systems, down to individual machines and processes.
Most industrial control systems began as proprietary, stand-alone systems that were separated from the rest of the world and isolated from most external threats. Today, widely available software applications, Internet-enabled devices and other nonproprietary IT offerings have been integrated into most such systems. This connectivity has delivered many benefits, but it also has increased the vulnerability of these systems to malicious attacks, equipment failures and other threats.
As a rule, these systems must operate continuously and reliably, often around the clock. Unlike information technology (IT) systems, which process, store, and transmit digital data, industrial control systems typically monitor the system environment and control physical objects and devices, such as pipeline valves. Disruptions or failures can result in death or injury, property damage, and loss of critical services.
Due to these unique performance, reliability and safety requirements, securing industrial control systems often requires adapting and extending the security standards and guidelines that NIST has developed for IT systems. The new guide describes these adaptations and extensions, provides an overview of various systems and their organizational layouts, describes typical threats and vulnerabilities, and recommends appropriate countermeasures.
"Securing an industrial control system requires a proactive, collaborative effort that engages cyber security experts, control engineers and operators and other experts and experienced workers," says NIST mechanical engineer and lead author Keith Stouffer. "It also requires factoring in—and addressing—new risks introduced by the evolving 'smart' electric power grid."
Stouffer recommends using the new guide along with Guidelines for Smart Grid Cyber Security (NISTIR 7628), which NIST issued last September, to tackle security issues arising from the convergence of the electric power Smart Grid and ICS.
The free 155-page guide can be downloaded from the NIST Computer Security Resource Center at: http://csrc.nist.gov/index.html.
*K. Stouffer, J. Falco and K. Scarfone, Guide to Industrial Control Systems (ICS) Security (SP 800-82). June 2011.
Media Contact: Mark Bello, email@example.com, 301-975-3776
Baldrige Program Takes Performance Excellence on the Road
Interested in improving your organization's performance? Then sign up for one of the two 2011 Baldrige Regional Conferences. This year, they're being held in Missouri and Alabama.
Senior leaders from the seven organizations selected for the 2010 Malcolm Baldrige National Quality Award, as well as representatives from previous award recipients, will share their best practices and results during the events on Sept. 13, 2011, at the Westin Crown Center, Kansas City, Mo., and on Sept. 27, 2011, at the Renaissance Birmingham Ross Bridge Golf Resort & Spa, Birmingham, Ala.
Participants can network with the recipients and gather valuable tips on organizational improvement, innovation and performance management, as well as advice on how to successfully apply the Baldrige Criteria for Performance Excellence.
The 2010 Baldrige Award recipients—listed with their category—are:
A pre-conference workshop is being offered the day before each conference to familiarize those new to the Baldrige process with the basics of the Baldrige Criteria, the benefits of self-assessment and the resources available to help organizations begin an organizational excellence program. The workshop is not a prerequisite for attending the conference.
For more information about, or to register online for, the conferences, go to http://www.nist.gov/baldrige/regionals/index.cfm. You also may contact the Baldrige Performance Excellence Program for details at (301) 975-2036 or firstname.lastname@example.org.
An early registration discount is available for the Missouri conference until Aug. 23, 2011, and for the Alabama conference until Sept. 6, 2011.
The 2011 Regional Conferences are co-sponsored by the Excellence in Missouri Foundation, the Alabama Productivity Center and the Alliance for Performance Excellence.
Media Contact: Michael E. Newman, email@example.com, 301-975-3025
Researchers Share Useful Lessons Learned in Evaluating Emerging Technologies
Most industry executives, military planners, research managers or venture capitalists charged with assessing the potential of an R&D project probably are familiar with the wry twist on Arthur C. Clarke's third law*: "Any sufficiently advanced technology is indistinguishable from a rigged demo."
After serving for five years as independent evaluators of emerging military technologies nurtured by the Defense Advanced Research Projects Agency (DARPA), a team from the National Institute of Standards and Technology (NIST) shares critical "lessons learned" that can help businesses and others negotiate the promises and pitfalls encountered when pushing the technology envelope to enable new capabilities.
Writing in the International Journal of Intelligent Control and Systems,** the NIST researchers also describe the evaluative framework they devised for judging the performance of a system and its components as well as the utility of the technology for the intended user. Called SCORE (System, Component, and Operationally Relevant Evaluations), the framework is a unified set of criteria and software tools for evaluating emerging technologies from different perspectives and levels of detail and at various stages of development.
SCORE was developed for evaluating so-called intelligent systems—a fast growing category of technologies ranging from robots and unmanned vehicles to sensor networks, natural language processing devices and "smart" appliances. By definition, explains Craig Schlenoff, acting head of NIST's Systems Integration Group, "Intelligent systems can respond to conditions in an uncertain environment—be it a battlefield, a factory floor, or an urban highway system—in ways that help the technology accomplish its intended purpose."
Schlenoff and his colleagues used their SCORE approach to evaluate technologies as they progressed under two DARPA programs: ASSIST and TRANSTAC. In ASSIST, DARPA is funding efforts to instrument soldiers with wearable sensors—video cameras, microphones, global positioning devices and more—to continuously record activities while they are on a mission. TRANSTAC is driving the development of two-way speech-translation systems that enable speakers of different languages to communicate with each other in real-world situations, without an interpreter. By providing constructive feedback on system capabilities, the SCORE evaluative framework helps to drive innovation and performance improvements.
Several lessons learned recounted by the NIST team are aimed at maximizing the contributions of test subjects and the developers of technologies without biasing test results. "There is often a balancing act," they write, "between creating the evaluation environment in a way that shows the system in the best possible light vs. having an environment that is as realistic as possible."
They also discuss unavoidable trade-offs due to costs, logistics, or other factors. While evaluators and technology developers should never lose sight of their ultimate objective, the NIST researchers also advise the need for flexibility over time. As more is learned about the system and about user requirements, features may change and project goals may be modified, necessitating adjustments to the evaluation approach.
"The main lesson," Schlenoff explains, "is that the extra effort devoted to evaluation planning can have a huge effect on how successful the evaluation will be. Bad decisions made during the design can be difficult and costly to fix later on."
* "Any sufficiently advanced technology is indistinguishable from magic."
**C. Schlenoff, B. Weiss and M. Steves. A detailed discussion of lessons learned in evaluating emerging and advanced military technologies. International Journal of Intelligent Control and Systems. Forthcoming. To learn more about SCORE, go to: http://www.nist.gov/el/isd/ks/score.cfm.
Media Contact: Mark Bello, firstname.lastname@example.org, 301-975-3776
Calling for Comments on Cybersecurity, Innovation and the Internet Economy
The Department of Commerce's Internet Policy Task Force is requesting comments on a report that proposes voluntary codes of conduct to strengthen the cybersecurity of companies that increasingly rely on the Internet to do business, but are not part of the critical infrastructure sector.
The report, Cybersecurity, Innovation and the Internet Economy, focuses on the "Internet and Information Innovation Sector" (I3S)—businesses that range from small and medium enterprises and bricks-and-mortar firms with online services, to social networking sites and Internet-only businesses, to cloud computing firms—all of which are increasingly subject to cyber attacks.
"Our economy depends on the ability of companies to provide trusted, secure services online," writes Commerce Secretary Gary Locke in the publication. "As new cybersecurity threats evolve, it's critical that we develop policies that better protect businesses and their customers to ensure the Internet remains an engine for economic growth."
Global online transactions are currently estimated by industry analysts at $10 trillion annually. As Internet business has grown, so has the threat of cyber attacks. The number of Internet malware threats was estimated to have doubled between January 2009 and December 2010. In 2010, an estimated 55,000 new viruses, worms, spyware and other threats were bombarding the Internet daily.
Cybersecurity, Innovation and the Internet Economy makes a number of specific recommendations for reducing I3S vulnerabilities:
A June 15, 2011, Federal Register notice requests comments on the task force report by August 1, 2011. Comments should be emailed to SecurityGreenPaper@nist.gov with the subject line "Comments on Cybersecurity Green Paper." The task force will continue to work with others in government to engage the domestic and global privacy community, and will consider publishing a refined set of policy recommendations in the future.
Media Contact: Evelyn Brown, email@example.com, 301-975-5661
NIST Contributions to Smart Grid Highlighted in White House Report, Event
A new White House policy document released today* highlights strategic roles that the National Institute of Standards and Technology (NIST) plays in accelerating the modernization of the nation's electric infrastructure, bolstering electric-grid innovation and advancing a clean energy economy, in part by promoting "smart grid" technologies. The White House unveiled the document today at its "Building the 21st Century Grid" event, which brought together the many cabinet departments and agencies working toward these goals.
The document, "A Policy Framework for the 21st Century Grid," was issued by the National Science and Technology Council (NSTC), administered by the White House's Office of Science and Technology Policy. It provides a roadmap for making investments in the nation's electric infrastructure in ways that benefit all Americans. Planned changes to the grid include its evolution into a far more modern system, integrating renewable sources of electricity, accommodating the growing number of electric vehicles, and helping to avoid blackouts and restore power more quickly when outages occur. To make these improvements, the strategy rests on four pillars, including one in which NIST plays a leading role: a focus on standards and interoperability to enable greater innovation in modernizing the grid.
Because updating the grid will require a host of interoperability standards to ensure its millions of devices and systems can operate effectively with one another, NIST has been working to bring various industry groups together to coordinate the development of these standards—an effort the White House highlights in the report.
"This document is the culmination of nine months of work by the NSTC subcommittee on Smart Grid to develop an administration policy framework for modernization," said George Arnold, the National Coordinator for Smart Grid Interoperability at NIST. "It highlights the importance of standards to ensure that innovation can be introduced into the grid without stranding previous investments. NIST has played a major role in the modernization effort along with the Department of Energy."
The event also served to highlight opportunities for innovation in the nation's electric infrastructure and to announce a number of new initiatives for achieving a clean energy economy, including NIST's work with the Asia-Pacific Economic Cooperation (APEC). NIST, together with the U.S. Trade Representative, has launched a collaborative initiative in APEC on Smart Grid interoperability standards, crucial to increasing market opportunities throughout the region, including for American firms.
"APEC's 21 countries are concerned with facilitating economic cooperation in the Asia-Pacific region, and each year they pick a key topic to advance regulatory cooperation to facilitate trade," Arnold said. "This year APEC elected to focus on Smart Grid because all these countries are modernizing their grids. These standards can either be a barrier to trade or facilitate it. If we can achieve a commonality of standards, manufacturers can build equipment and then sell it across the region."
For more information on the Grid Modernization Event and the report, visit the White House's fact sheet at www.whitehouse.gov/administration/eop/ostp/pressroom/06132011.
* Originally released on June 13, 2011.
Media Contact: Chad Boutin, firstname.lastname@example.org, 301-975-4261