Tech Beat - November 8, 2011

Editor: Michael Baum
Date created: November 8, 2011
Date Modified: November 8, 2011 
Contact: inquiries@nist.gov

Fish Flu: Genetics Approach May Lead to Treatment

A research team at the National Institute of Standards and Technology (NIST) has provided the first look* at a genetic structure that may play a critical role in the reproduction of the infectious salmon anemia virus (ISAV), more commonly known as the “fish flu.” A scourge in fish farms with a mortality rate as high as 90 percent, ISAV was recently found in wild salmon in the Pacific Northwest for the first time, threatening an already dwindling population and the vast food web it supports.

Salmon swimming within a netted pen at a fish farm in Maine.
Credit: Department of Marine Resources, State of Maine

While there is a vaccine for the virus, it must be administered by injection—a task that is both cumbersome and economically impractical for the aquaculture industry. A drug or vaccine that prevents the spread of the disease by interfering with the virus’ ability to replicate its genetic code (contained in eight segments of ribonucleic acid or RNA) would be far more practical for fish farmers and marine biologists to deliver.

Robert Brinson, a NIST scientist working at the Hollings Marine Laboratory (HML) in Charleston, S.C., and NIST colleagues Andrea Szakal and John Marino, working at the Institute for Bioscience and Biotechnology Research (IBBR) in Rockville, Md., knew from the scientific literature that the family of viruses that includes both the many types of influenza—the causes of yearly human flu outbreaks—and infectious salmon anemia forms “panhandle” structures in its genomic RNA. In human influenza, these panhandles are known to interact with proteins that begin the process of copying and replicating the virus.

Hypothesizing that the fish flu virus might function the same way, Brinson and his colleagues used high-resolution nuclear magnetic resonance (NMR) spectroscopy and thermal melting methods to look at the genetic structure in the same region of the ISAV RNA. They found that the ISAV genome does appear to have a panhandle “motif” (the poetic term used by geneticists to define a discrete nucleotide sequence that functions independently of the rest of the genome and directs a specific biological function). The NIST work provides the first experimental evidence and characterization of this panhandle motif that may function similarly in the fish virus as the “green light” for viral RNA replication processes.

“The next step,” Brinson says, “is to investigate the relevant proteins and how they interface with the RNA. What molecular features drive the protein to recognize the RNA? How is it binding, and how is it interacting with the RNA?” Brinson says that this work may facilitate the development of new approaches for interfering with the replication of these viruses to mitigate the effects of ISAV on the aquaculture of salmon. It also demonstrates that the salmon virus can be used as an experimental model for understanding the replication machinery of human and other related influenza viruses.

The HML is a unique partnership of governmental and academic agencies including NIST, NOAA's National Ocean Service, the South Carolina Department of Natural Resources, the College of Charleston and the Medical University of South Carolina. The HML NMR facility focuses on the multi-institutional mission of metabolomics, natural products and structural biology. The IBBR is a University System of Maryland joint research enterprise created to enhance collaboration among the University of Maryland College Park, The University of Maryland Baltimore and NIST.

*R.G. Brinson, A.L. Szakal and J.P. Marino. Structural characterization of the viral and complementary RNA panhandle motifs from the Infectious Salmon Anemia Virus. Journal of Virology. Published online Oct. 12, 2011.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


Are Electron Tweezers Possible? Apparently So.

Video clip showing electron tweezers at work. Credit: Oleshko, NIST

Not to pick up electrons, but tweezers made of electrons. A recent paper* by researchers from the National Institute of Standards and Technology (NIST) and the University of Virginia (UVA) demonstrates that the beams produced by modern electron microscopes can be used not just to look at nanoscale objects, but to move them around, position them and perhaps even assemble them.

Essentially, they say, the tool is an electron version of the laser “optical tweezers” that have become a standard tool in biology, physics and chemistry for manipulating tiny particles. Except that electron beams could offer a thousand-fold improvement in sensitivity and resolution.

Optical tweezers were first described in 1986 by a research team at Bell Labs. The general idea is that under the right conditions, a tightly focused laser beam will exert a small but useful force on tiny particles. Not pushing them away, which you might expect, but rather drawing them towards the center of the beam. Biochemists, for example, routinely use the effect to manipulate individual cells or liposomes under a microscope.
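For orientation (this sketch is not drawn from the NIST paper), the counterintuitive pull toward the beam center is usually explained, in the Rayleigh regime where the particle is much smaller than the wavelength, as a gradient force on an induced dipole:

```latex
% Time-averaged gradient force on a small dielectric sphere in a
% focused beam; \alpha is the particle's polarizability and
% I(\mathbf{r}) the local beam intensity.
\mathbf{F}_{\mathrm{grad}} \;=\; \tfrac{1}{2}\,\alpha\,\nabla \langle E^2 \rangle \;\propto\; \alpha\,\nabla I(\mathbf{r})
```

When the polarizability is positive (the particle's refractive index exceeds the surrounding medium's), the force points up the intensity gradient—toward the focus—so particles are drawn in rather than pushed away.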

If you just consider the physics, says NIST metallurgist Vladimir Oleshko, you might expect that a beam of focused electrons—such as that created by a transmission electron microscope (TEM)—could do the same thing. However, that’s never been seen, in part because electrons are much fussier to work with. They can’t penetrate far through air, for example, so electron microscopes use vacuum chambers to hold specimens.

So Oleshko and his colleague, UVA materials scientist James Howe, were surprised when, in the course of another experiment, they found themselves watching an electron tweezer at work. They were using an electron microscope to study, in detail, what happens when a metal alloy melts or freezes. They were observing a small particle—a few hundred nanometers wide—of an aluminum-silicon alloy held just at a transition point where it was partially molten, a liquid shell surrounding a core of still solid metal. In such a small sample, the electron beam can excite plasmons, a kind of quantized wave in the alloy’s electrons, that reveals a lot about what happens at the liquid-solid boundary of a crystallizing metal. “Scientifically, it’s interesting to see how the electrons behave,” says Howe, “but from a technological point of view, you can make better metals if you understand, in detail, how they go from liquid to solid.”

“This effect of electron tweezers was unexpected because the general purpose of this experiment was to study melting and crystallization,” Oleshko explains. “We can generate this sphere inside the liquid shell easily; you can tell from the image that it’s still crystalline. But we saw that when we move or tilt the beam—or move the microscope stage under the beam—the solid particle follows it, like it was glued to the beam.”

Potentially, Oleshko says, electron tweezers could be a versatile and valuable tool, adding very fine manipulation to wide and growing lists of uses for electron microscopy in materials science.** “Of course, this is challenging because it requires a vacuum,” he says, “but electron probes can be very fine, three orders of magnitude smaller than photon beams—close to the size of single atoms. We could manipulate very small quantities, even single atoms, in a very precise way.”

* V.P. Oleshko and J.M. Howe. Are electron tweezers possible? Ultramicroscopy (2011) doi:10.1016/j.ultramic.2011.08.015.
** See, for example, the Jan. 19, 2011, Tech Beat story “NIST Puts a New Twist on the Electron Beam” at www.nist.gov/public_affairs/tech-beat/tb20110119.cfm#tem.
Edited on Nov. 18, 2011, to correct "microns" to "nanometers" in the fifth paragraph.

Media Contact: Michael Baum, baum@nist.gov, 301-975-2763


NIST Physicists Chip Away at Mystery of Antimatter Imbalance

Why there is stuff in the universe—more properly, why there is an imbalance between matter and antimatter—is one of the long-standing mysteries of cosmology. A team of researchers working at the National Institute of Standards and Technology (NIST) has just concluded a 10-year-long study* of the fate of neutrons, the most sensitive such measurement ever made, in an attempt to resolve the question. The universe, they concede, has managed to keep its secret for the time being, but they’ve succeeded in significantly narrowing the number of possible answers.

antimatter diagrams
Two types of neutron decay produce a proton, an electron and an electron antineutrino but eject them in different configurations. The experiments at NIST detected no imbalance, but the improved sensitivity could help place limits on competing theories about the matter-antimatter imbalance in the universe.
Credit: emiT team

Though the word itself evokes science fiction, antimatter is an ordinary—if highly uncommon—material that cosmologists believe once made up almost exactly half of the substance of the universe. When particles and their antiparticles come into contact, they instantly annihilate one another in a flash of light. Billions of years ago, most of the matter and all of the antimatter vanished in this fashion, leaving behind a tiny bit of matter awash in cosmic energy. What we see around us today, from stars to rocks to living things, is made up of that excess matter, which survived because a bit more of it existed.

“The question is, why was there an excess of one over the other in the first place?” says Pieter Mumm, a physicist at NIST’s Physical Measurement Laboratory. “There are lots of theories attempting to explain the imbalance, but there’s no experimental evidence to show that any of them can account for it. It’s a huge mystery on the level of asking why the universe is here. Accepted physics can’t explain it.”

physicist Pieter Mumm
Physicists including Pieter Mumm (shown) used the emiT detector they built at NIST to investigate any potential statistical imbalance between the two natural types of neutron decay.
Credit: emiT team

An answer might be found by examining radioactivity in neutrons, which decay in two different ways that can be distinguished by a specially configured detector. Though all observations thus far have shown that these two modes occur with equal frequency in nature, finding a slight imbalance between the two would imply that nature favors conditions that would create a bit more matter than antimatter, resulting in the universe we recognize.
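In the conventional parameterization of neutron beta decay (sketched here for orientation, not taken from the article), the quantity being hunted is a nonzero triple-correlation coefficient D in the decay rate:

```latex
% Triple-correlation term in the neutron beta-decay rate.
% \sigma_n is the neutron spin direction; p_e, p_\nu (E_e, E_\nu) are
% the electron and antineutrino momenta (energies). A nonzero D
% would signal a violation of time-reversal symmetry.
dW \;\propto\; 1 \;+\; D\,\frac{\boldsymbol{\sigma}_n \cdot (\mathbf{p}_e \times \mathbf{p}_\nu)}{E_e\,E_\nu}
```

A measurably nonzero D would mean the decay distinguishes a "handedness" in the ejected particles, the kind of time-reversal violation that many theories invoke to explain how a matter excess could arise.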

Mumm and his collaborators from several institutions used a detector at the NIST Center for Neutron Research to explore this aspect of neutron decay with greater sensitivity than was ever possible before. For the moment, the larger answer has eluded them—several years of observation and data analysis once again turned up no imbalance between the two decay paths. But the improved sensitivity of their approach means that they can severely limit some of the numerous theories about the universe’s matter-antimatter imbalance, and with future improvements to the detector, their approach may help constrain the possibilities far more dramatically.

“We have placed very tight constraints on what these theories can say,” Mumm says. “We have given theory something to work with. And if we can modify our detector successfully, we can envision limiting large classes of theories. It will help ensure the physics community avoids traveling down blind alleys.”

The research team also includes scientists from the University of Washington, the University of Michigan, the University of California at Berkeley, Lawrence Berkeley National Laboratory, Tulane University, the University of Notre Dame, Hamilton College and the University of North Carolina at Chapel Hill. Funding was provided by the U.S. Department of Energy and the National Science Foundation.

* H.P. Mumm, T.E. Chupp, R.L. Cooper, K.P. Coulter, S.J. Freedman, B.K. Fujikawa, A. García, G.L. Jones, J.S. Nico, A.K. Thompson, C.A. Trull, J.F. Wilkerson and F.E. Wietfeldt. New limit on time-reversal violation in beta decay. Physical Review Letters, Vol. 107, Issue 10, DOI: 10.1103/PhysRevLett.107.102301.

Media Contact: Chad Boutin, chad.boutin@nist.gov, 301-975-4261


New Report Urges More Detailed Utility Metering to Improve Building Efficiency

A new interagency report recommends systematic consideration of new metering technologies, called submetering, that can yield up-to-date, finely grained snapshots of energy and water usage in commercial and residential buildings to guide efficiency improvements and capture the advantages of a modernized electric power grid.

Commercial and residential buildings consume vast amounts of energy, water, and material resources. In fact, U.S. buildings account for more than 40 percent of total U.S. energy consumption, including 72 percent of electricity use. If current trends continue, buildings worldwide will be the largest consumer of global energy by 2025. By 2050, buildings are likely to use as much energy as the transportation and industrial sectors combined.

Submetering is the use of metering devices to measure actual energy or water consumption at points downstream from the primary utility meter on a campus or building. Submetering allows building owners to monitor energy or water usage for individual tenants, departments, pieces of equipment or other loads to account for their specific usage. Submetering technologies enable building owners to optimize design and retrofit strategies and to make energy and water management procedures more efficient and effective.

While the return on investment (ROI) for submeters depends on specific energy-efficiency strategies that may vary by climate, building type, and other factors, "numerous case studies provide evidence that the ROI can be significant,” concludes the report, Submetering of Building Energy and Water Usage: Analysis and Recommendations of the Subcommittee on Buildings Technology Research and Development. Installing submetering technology also makes possible the use of more advanced conservation technologies in the future, the report notes.

The report is a product of the Buildings Technology Research and Development Subcommittee of the National Science and Technology Council (NSTC), a cabinet-level council that is the principal means within the executive branch to coordinate science and technology policy across the diverse entities that make up the federal research and development enterprise.

The NSTC report provides an overview of the key elements of submetering and associated energy management systems to foster understanding of associated benefits and complexities. It documents the current state of submetering and provides relevant case studies and preliminary findings relating to submetering system costs and ROI. The report also addresses gaps, challenges and barriers to widespread acceptance along with descriptive candidate areas where additional development or progress is required. It also surveys policy options for changing current buildings-sector practices.

The 74-page report can be downloaded from www.bfrl.nist.gov/buildingtechnology/documents/SubmeteringEnergyWaterUsageOct2011.pdf.

For more details, see the Nov. 8, 2011 announcement, “Government Issues Building Energy and Water Submetering Report” at www.nist.gov/el/submetering.cfm.

Media Contact: Mark Bello, mark.bello@nist.gov, 301-975-3776


NICE Issues Cybersecurity Workforce Framework for Public Comment

The National Initiative for Cybersecurity Education (NICE) has published for public comment a draft document that classifies the typical duties and skill requirements of cybersecurity workers. The document is meant to define professional requirements in cybersecurity, much as other professions, such as medicine and law, have done.

NICE is an interagency effort coordinated by the National Institute of Standards and Technology (NIST) and focused on cybersecurity awareness, education, training and professional development. NICE activities include increasing cybersecurity awareness for children and adults of all ages, promoting community college and university-level programs in cybersecurity, and expanding professional training opportunities.

The new document, the NICE Cybersecurity Workforce Framework, was created by the NICE group responsible for creating and maintaining a highly skilled workforce to meet the nation’s computer security needs. Over 20 participating agencies contributed to the group’s efforts.

“One thing NICE has found is that there has not been a consistent way to define or describe cybersecurity work across the federal workforce,” says NICE Lead Ernest McDuffie. Cybersecurity professionals previously have not fit into the standard occupations, job titles, position descriptions and the federal job classification and job grading system managed by the Office of Personnel Management (OPM).

Not having a common language to discuss and understand the work and skill requirements hinders federal employers in setting basic requirements, identifying skill gaps and providing training and professional development opportunities for their workforce. “Other professions have organized their specialties, and now it is time for a common set of definitions for the cybersecurity workforce,” said McDuffie.

The NICE Cybersecurity Workforce Framework provides a working taxonomy, or vocabulary, that is designed to fit into any organization’s existing occupational structure. The framework is based on information gathered from federal agencies through two years of surveys and workshops by OPM, a major Department of Defense study of the cybersecurity workforce and a study by the Federal CIO Council.

In opening the draft document up for public comment, NICE hopes to refine the framework so that it can be useful in both the public and private sectors to better protect the nation from escalating cybersecurity threats. Authors also want the framework to address emerging work requirements to help ensure the nation has the skills to meet them. The authors are requesting input from all of the nation’s cybersecurity stakeholders including academia, professionals, not-for-profit organizations and private industry.

The framework organizes cybersecurity work into high-level categories ranging from the design, operation and maintenance of cybersecurity systems to incident response, information gathering and analysis. The structure is based on job analyses and groups together work and workers that share common major functions, regardless of job title.

To read the document and provide comments, go to http://niccs.us-cert.gov/research/national-cybersecurity-workforce-framework. The webpage also provides a template for comments, which are due Dec. 16, 2011.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


Draft Roadmap for Cloud Computing Technology

The National Institute of Standards and Technology (NIST) has released for public comment a draft "roadmap" that is designed to foster federal agencies' adoption of cloud computing, support the private sector, improve the information available to decision makers and facilitate the continued development of the cloud computing model.

NIST plans to issue the final U.S. Government Cloud Computing Roadmap as a three-volume work. The first two volumes were posted for public comment on Nov. 1, 2011.

The draft publication defines high-priority requirements for standards, official guidance and technology developments that need to be met in order for agencies to accelerate their migration of existing IT systems to the cloud computing model.

Volume I, High-Priority Requirements to Further USG Agency Cloud Computing Adoption, provides a general understanding and overview of the roadmap initiative, including:

  • prioritized interoperability, portability and security requirements that must be met to further government cloud adoption;
  • standards, guidelines and technology that must be in place to satisfy these requirements; and,
  • a list of Priority Action Plans (PAPs) recommended for voluntary self-tasking by the cloud stakeholder community to support standards, guidelines and technology development.

Volume II, Useful Information for Cloud Adopters, is the nuts-and-bolts publication. It is a technical reference for those working on strategic and tactical cloud computing initiatives, whether or not they work in government agencies. Volume II integrates and summarizes the work completed to date, explains the assessment findings based on this work, and shows how these findings support the roadmap introduced in Volume I.

The third volume, Technical Considerations for USG Cloud Computing Deployment Decisions, is under development as part of an interagency and public working group collaborative effort. It is intended as a guide for decision makers who are planning and implementing cloud computing solutions.

Much of the work that forms the basis for the roadmap has been completed through public working groups open to interested parties from industry, academia and government. Hundreds of people are registered in the five NIST Cloud Computing Working Groups that were established in November 2010. The working groups also contributed to the content of two related cloud publications released earlier this year—NIST Cloud Computing Standards Roadmap (SP 500-291) and NIST Cloud Computing Reference Architecture (NIST SP 500-292).

Volumes I and II of U.S. Government Cloud Computing Technology Roadmap, Release 1.0 (SP 500-293) can be retrieved at www.nist.gov/itl/cloud/index.cfm, along with other cloud publications and the Technical Considerations for USG Cloud Computing Deployment Decisions working document, which will eventually be released as the third volume of SP 500-293.

For more details, see the NIST Nov. 1, 2011, announcement, “NIST Releases Draft Cloud Computing Technology Roadmap for Comments” at www.nist.gov/itl/csd/cloud-110111.cfm. Comments on the first two volumes are due by 5 p.m. Eastern time Dec. 2, 2011. Electronic comments should be sent to ccroadmap.comments@nist.gov or written ones can be mailed to Robert Bohn, NIST, 100 Bureau Dr., Stop 2000, Gaithersburg, MD 20899-2000.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


NIST Signs Agreement to Enhance Cybersecurity Education Programs

The National Institute of Standards and Technology (NIST) has agreed to work with the Department of Education and a new organization, the National Cybersecurity Education Council (NCEC), to develop a strategic public-private partnership to promote formal cybersecurity education.

The plan, outlined in a recent Memorandum of Understanding (MOU), is designed to help the National Initiative for Cybersecurity Education (NICE) meet one of its top priorities, to “broaden the pool of skilled workers capable of supporting a cyber-secure nation.”

NICE is coordinated by NIST, and the program’s goals are to enhance the nation’s cybersecurity through improving awareness, formal education, workforce structure, and workforce training and professional development in cybersecurity.

NIST, the Department of Education and NCEC agreed to work together with their state and local counterparts to enhance formal cybersecurity education programs for kindergarten through 12th grade, higher education and vocational programs to provide skilled cybersecurity workers for both the private sector and government. They will cooperate on the development of innovative cybersecurity education programs, share ideas and concepts to enhance cybersecurity programs and engage in cooperative program planning and development.

The idea for this partnership was conceived at the first NICE Workshop in August 2010. Cybersecurity groups, including the National Cyber Security Alliance (NCSA) and Cyberwatch, ran with the idea, and NCEC now comprises 67 organizations and individuals from industry and academia—including many of the country’s largest computer organizations.

“This MOU is the first example of public-private partnerships that NICE will use to enhance and extend its reach and effectiveness across the country,” said NICE Lead Ernest McDuffie. “Such partnerships are a key strategy for the future operations of NICE.”

NCEC’s first task under the MOU is to build a comprehensive baseline listing of formal cybersecurity education activities operating across the country. This will be a valuable aid to NICE and others interested in identifying potential partners and in addressing any unmet needs.

Other federal agencies and organizations are welcome to join the effort. Interested groups should contact NICE Lead Ernest McDuffie at emcduffie@nist.gov. To learn more about NICE, see the website http://csrc.nist.gov/nice; about NCEC, contact the NCSA at http://www.staysafeonline.org/.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


NIST to Help Speed Technology Transfer from Federal Labs

The National Institute of Standards and Technology (NIST) will play a significant role in helping speed the transfer of federal research and development from the laboratory to the marketplace, as part of a plan laid out in a Presidential Memorandum issued on Oct. 28, 2011. The goal is to help U.S. businesses create jobs and strengthen their competitiveness in a global economy.

The Presidential Memorandum caps an extensive effort by the White House Office of Science and Technology Policy, widely supported by the Department of Commerce and other agencies, to accelerate innovation. This effort has already led to steps to encourage entrepreneurship in the private sector and to marshal the investment in federal laboratory research to help America innovate and compete.

“NIST is in a good position to lead these efforts because we are a federal lab ourselves, but also because of our experience coordinating technology transfer efforts and in establishing metrics to evaluate their effectiveness,” said Phillip Singerman, NIST’s Associate Director for Innovation and Industry Services.

Through its existing role coordinating the Interagency Workgroup on Technology Transfer, NIST will help lead agencies with federal laboratories to develop plans that establish performance goals to increase the number and pace of effective technology transfer and commercialization activities in partnership with nonfederal organizations. The group also will be responsible for recommending opportunities to improve technology transfer from federal labs and for refining how tech transfer is defined, to better capture data on all of the ways it happens.

NIST will coordinate development and analysis of appropriate metrics and will continue to report and analyze results through its annual report on technology transfer, which covers 11 federal agencies. “We plan to improve and expand the collection of metrics to better measure the commercial impact of federal technology transfer,” said Singerman.

Singerman points to examples of NIST’s own success in tech transfer as supporting the decision to have the organization lead this important effort. For example, in 2010 NIST licensed a technology called RoboCrane, which is now being used to build a new confinement structure at the Chernobyl Nuclear Power Plant. Another NIST research project, this one in refrigeration technology, has enabled the emergence of patented cryosurgical instruments for treating heart arrhythmias and uterine conditions, and generated millions of dollars in revenue for the licensee.

NIST has also helped architects, engineers and the construction industry select environmentally preferred and cost-effective products through free downloadable software for analysis and selection of building materials. Building for Environmental and Economic Sustainability (BEES) includes performance data across a wide range of applications for more than 230 products, and it won the 2010 GreenGov Presidential Award from the White House Council on Environmental Quality.

The full text of the White House press release on the Presidential Memorandum can be read at www.whitehouse.gov/the-press-office/2011/10/28/we-cant-wait-obama-administration-announces-two-steps-help-businesses-cr.

See the most recent “Federal Laboratory Technology Transfer Summary Reports” at www.nist.gov/tpo/publications/index.cfm. Read more about RoboCrane in the Aug. 17, 2010, issue of Tech Beat, “NIST Technology Called Upon to Clean Up Chernobyl Disaster Site” at www.nist.gov/public_affairs/tech-beat/tb20100817.cfm#robocrane. Learn more about 2010 GreenGov Presidential Award winner BEES at www.nist.gov/el/lippiatt_101310.cfm.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343
