Tech Beat - July 1, 2014

Editor: Michael Baum
Date created: July 1, 2014
Date Modified: July 1, 2014 
Contact: inquiries@nist.gov

NIST Test House Exceeds Goal; Ends Year with Energy to Spare

The net-zero energy test house at the National Institute of Standards and Technology (NIST) in suburban Washington, D.C., not only absorbed winter's best shot, it came out on top, reaching its one-year anniversary on July 1 with enough surplus energy to power an electric car for about 1,440 miles.*

Despite five months of below-average temperatures and twice the normal amount of snowfall, NIST's Net-Zero Energy Residential Test Facility (NZERTF) ended its one-year test run with 491 kilowatt hours of extra energy. Instead of paying almost $4,400 for electricity—the estimated average annual bill for a comparable modern home in Maryland—the virtual family of four residing in the all-electric test house actually earned a credit by exporting the surplus energy to the local utility.

"We made it—and by a convincing margin," said Hunter Fanney, the mechanical engineer who leads NZERTF-based research. "From here on in, our job will be to develop tests and measurements that will help to improve the energy efficiency of the nation's housing stock and support the development and adoption of cost-effective, net-zero energy designs and technologies, construction methods and building codes."

A net-zero energy house produces at least as much energy as it consumes over the course of a year. A number of states are taking steps toward encouraging or even requiring construction of net-zero energy homes in the future. For example, California will require that, as of 2020, all newly constructed homes be net-zero energy ready.

Both a laboratory and a house, the two-story, four-bedroom, three-bath NZERTF would blend in nicely in a new suburban subdivision. But it was designed and built to be about 60 percent more energy efficient than houses built to meet the requirements of the 2012 version of the International Energy Conservation Code, which Maryland has adopted.

The 2,700-square-foot (252-square-meter) test house is built to U.S. Green Building Council LEED Platinum standards—the council's highest rating for sustainable structures. Its features include energy-efficient construction and appliances, as well as energy-generating technologies, such as solar water heating and a solar photovoltaic system.

Despite a harsh winter that left the Net-Zero Energy Residential Test Facility's photovoltaic and solar thermal panels covered with snow on 38 days, the energy-efficient house produced more energy than it used over the course of a year.
Credit: NIST

Despite 38 days when the test house's solar panels were covered with snow or ice, the NZERTF's sun-powered generation system produced 13,577 kilowatt hours of energy. That's 491 kilowatt hours more than used by the house and its occupants, a computer-simulated family of two working parents and two children, ages 8 and 14.

First-year energy use totaled 13,086 kilowatt hours, which was about 3,000 kilowatt hours more than projected usage in a year with typical weather. In a normal year, a comparable home built to meet Maryland's residential energy standard would consume almost 27,000 kilowatt hours of energy.
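The arithmetic behind the headline figures can be checked directly from the numbers in this story; here is a minimal sketch, assuming nothing beyond the quantities quoted here and the EPA miles-per-kilowatt-hour estimate cited in the footnote below:

```python
# Energy balance for the NZERTF's first year, using figures from this article.
produced_kwh = 13_577   # solar generation over the year
consumed_kwh = 13_086   # total consumption by the house and its virtual occupants

surplus_kwh = produced_kwh - consumed_kwh
print(surplus_kwh)                # 491 kWh exported to the grid

# EPA (2012) estimate: an electric car gets 2.94 miles per kilowatt hour.
print(round(surplus_kwh * 2.94))  # ~1,444 miles, rounded in the story to 1,440
```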

In terms of energy consumed per unit of living space—a measure of energy-use intensity—the NIST test house is calculated to be almost 70 percent more efficient than the average house in Washington, D.C., and nearby states.

From July through October, the facility registered monthly surpluses. In November, when space-heating demands increased and the declining angle of the sun reduced the energy output of its 32 solar panels, the NZERTF began running monthly deficits. Temperatures averaged below normal through March 31, by which point the house's net energy deficit had grown to about 1,800 kilowatt hours—roughly the combined amount of energy a refrigerator and clothes dryer use in a year.

Starting in April, the energy tide turned, as the house began to export electric power to the grid on most days.

"The most important difference between this home and a Maryland code-compliant home is the improvement in the thermal envelope—the insulation and air barrier," says NIST mechanical engineer Mark Davis. By nearly eliminating the unintended air infiltration and doubling the insulation level in the walls and roof, the heating and cooling load was decreased dramatically.

In terms of cost, the NZERTF's virtual residents saved $4,373 in electricity payments, or $364 a month. However, front-end costs for solar panels, added insulation, triple-paned windows, and other technologies and upgrades aimed at achieving net-zero energy performance are sizable, according to an analysis by NIST economist Joshua Kneifel.**

In all, Kneifel estimates that incorporating all of the NZERTF's energy-related technologies and efficiency-enhancing construction improvements would add about $162,700 to the price of a similar house built to comply with Maryland's state building code.
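Kneifel's study is a full life-cycle cost analysis, but the two figures quoted here allow a back-of-the-envelope simple-payback estimate. The sketch below is illustrative only; it ignores financing, maintenance, incentives and energy-price escalation, none of which are addressed in this article:

```python
# Naive simple-payback estimate from the article's figures.
added_cost = 162_700     # estimated construction-cost premium, in dollars
annual_savings = 4_373   # first-year avoided electricity payments, in dollars

print(round(annual_savings / 12))          # ~$364 saved per month
print(round(added_cost / annual_savings))  # ~37 years to simple payback
```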

Planned measurement-related research at the NZERTF will yield knowledge and tools to help trim this cost difference. Results also will be helpful in identifying affordable measures that will be most effective in reducing energy consumption. And research will further the development of tests and standards that are reliable benchmarks of energy efficiency and environmental performance overall, providing information useful to builders, home buyers, regulators and others.

* An electric car gets 2.94 miles per kilowatt hour, according to the Environmental Protection Agency (2012).
** J.D. Kneifel. Life-Cycle Cost Comparison of the NIST Net Zero Energy Residential Test Facility to a Maryland Code-Compliant Design. NIST Special Publication 1172, June 2014.

Media Contact: Mark Bello, mark.bello@nist.gov, 301-975-3776


Call for Comments: A Strategic Plan for the Materials Genome Initiative

The federal government is seeking public comment on a draft strategic plan for the Obama administration's Materials Genome Initiative (MGI). The draft 2014 Materials Genome Initiative Strategic Plan lays out a multi-year program to address four key challenges to achieving the MGI vision for materials research.

Model of crystal plasticity studied at NIST. The ultimate goal of this work is to predict the stress and strain behavior of complex, polycrystalline materials under development.
Credit: Ma/NIST

Launched by President Obama in 2011, the MGI is a program that links government research agencies with industry and academic institutions in an effort to establish a new paradigm for creating advanced materials that meet the needs of a broad range of industries, from defense, aerospace, transportation and telecommunications to nano- and biotechnology. A central goal of the MGI is to halve the time required to move new advanced materials from laboratory discovery to commercial use.

The main emphasis of the MGI is to replace the standard trial-and-error approach to the design of new high-performance materials—alloys, composites, nanostructured materials—with a rational design approach based on multidisciplinary research, theory, computer models and vast data sets.

The draft MGI strategic plan highlights four specific challenges:

  • Leading a culture shift in materials research to encourage and facilitate an integrated team approach that links computation, data and experiment and crosses boundaries from academia to industry;
  • Integrating experiment, computation and theory and equipping the materials community with the advanced tools and techniques to work across materials classes from research to industrial application;
  • Making digital data accessible, including combining data from experiment and computation into a searchable materials data infrastructure and encouraging researchers to make their data available to others; and
  • Creating a world-class materials workforce that is trained for careers in academia or industry, including high-tech manufacturing jobs.

The draft MGI strategic plan, which includes proposed courses of action and milestones to meet each challenge, is available at www.nist.gov/mgi/upload/MGI-StrategicPlan-2014.pdf. A Federal Register notice detailing the call for public comments and how to file comments is available at https://federalregister.gov/a/2014-14392. Comments are due by July 21, 2014.

The federal portion of the MGI is managed by the National Science and Technology Council (NSTC), which coordinates science and technology efforts across the federal government. Federal participants in the MGI include the Departments of Defense and Energy, the National Science Foundation (NSF) and the Commerce Department's National Institute of Standards and Technology (NIST).

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


Seeing Your True Colors: Standards for Hyperspectral Imaging

Today, doctors who want to see whether a wound is healing may have to do a biopsy or some other invasive technique that, besides injuring an already injured patient, can only offer information about a small area. But a technology called hyperspectral imaging offers doctors a noninvasive, painless way to discriminate between healthy and diseased tissue and reveal how well damaged tissue is healing over a wide area. The catch? A lack of calibration standards is impeding its use.

NIST researchers are gathering skin reflectance data to establish the variation found in human tissue in order to develop reference standards for hyperspectral imaging applications. The top images show skin as normally viewed; at bottom, the same images with contrast enhanced in false color to show the variability between subjects.
Credit: Cooksey, Allen/NIST

After a successful non-human trial, researchers at the National Institute of Standards and Technology (NIST) have started gathering data on how human skin looks under various wavelengths of light in order to develop these badly needed standards.*

Unlike consumer digital cameras and the human eye, which sense only red, green and blue light (a relatively narrow portion of the electromagnetic spectrum), a hyperspectral imager captures information in hundreds of narrow spectral bands, from the ultraviolet to the infrared, at every pixel of an image.
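In data terms, each hyperspectral image is a three-dimensional "cube": two spatial axes plus a spectral axis, so every pixel holds a full spectrum rather than three color values. A minimal sketch of that structure (the dimensions and random values here are purely illustrative, not the specifications of any NIST instrument):

```python
import numpy as np

# Illustrative hyperspectral cube: 64 x 64 pixels, 200 spectral bands
# spanning 250-2500 nm (the range covered by the NIST study cited below).
rows, cols, bands = 64, 64, 200
wavelengths_nm = np.linspace(250, 2500, bands)
cube = np.random.rand(rows, cols, bands)  # stand-in for measured reflectance

# Unlike an RGB image (3 values per pixel), each pixel is a full spectrum:
spectrum = cube[10, 20, :]   # reflectance vs. wavelength at one pixel
print(spectrum.shape)        # (200,)
```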

According to NIST researcher David Allen, being sensitive to so many wavelengths means hyperspectral imagers can see many different things that humans can't see, including the amount of oxygen in human tissues, an indicator of healing.

"The potential of the technology has been proven, but the problem is that researchers are simply lacking a way to assure consistent results between labs," says Allen. "Standards development has itself been hindered by a lack of human skin reflectance data, especially in the ultraviolet and short-wave infrared."

Catherine Cooksey, the project leader for the spectrophotometry program that establishes and maintains the national scale of reflectance, says that before researchers can characterize what diseased tissue looks like hyperspectrally, they need to know what so-called "normal" tissue looks like. They are also looking to quantify the variability both within an individual and between individuals due to inherent biological differences. The initial NIST studies used 28 volunteer test subjects. The data collected included a photograph of the test area on each subject's forearm and three reflectance measurements of that area.
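With three repeat measurements on each of 28 subjects, one natural way to separate "within an individual" from "between individuals" variability is to compare the spread of each subject's own repeats against the spread of the subject averages. The sketch below uses simulated numbers for a single spectral band; it is a hedged illustration of the idea, not the statistical analysis from the NIST paper:

```python
import numpy as np

# Simulated stand-in for the study design: 28 subjects x 3 repeat
# reflectance measurements in one spectral band.
rng = np.random.default_rng(0)
subject_means = rng.normal(0.45, 0.08, size=28)               # biological differences
data = subject_means[:, None] + rng.normal(0, 0.01, (28, 3))  # measurement repeats

within_sd = data.std(axis=1, ddof=1).mean()  # spread of each subject's own repeats
between_sd = data.mean(axis=1).std(ddof=1)   # spread across subject averages
print(f"within-subject SD: {within_sd:.3f}, between-subject SD: {between_sd:.3f}")
```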

"Skin reflectance varies due to skin pigmentation, tissue density, lipid content and blood volume changes," says Cooksey. "And few, if any, studies of skin reflectance have been done with an estimated measurement uncertainty that is traceable to NIST or any other national metrology institute. We need good data from a wide variety of sources, and for that we need the help of our colleagues in the community."

Once they collect enough data, the NIST researchers can feed it into NIST's Hyperspectral Image Projector, a device that creates hyperspectral scenes with all the spectral signatures of the real thing—in this case, tissue in various stages of repair. Medical imaging technicians can then use these "digital tissue phantoms" to test their imagers' ability to detect and distinguish different tissue types and conditions.

Those interested in helping to gather skin reflectance data should contact Allen (dwallen@nist.gov) or Cooksey (catherine.cooksey@nist.gov) for more information.

*C.C. Cooksey, B.K. Tsai and D.W. Allen. "A collection and statistical analysis of skin reflectance signatures for inherent variability over the 250 nm to 2500 nm spectral range." Presented at the SPIE Defense, Security & Sensing Conference, June 4, 2014, Baltimore, Md.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


New NIST Metamaterial Gives Light a One-Way Ticket

The light-warping structures known as metamaterials have a new trick in their ever-expanding repertoire. Researchers at the National Institute of Standards and Technology (NIST) have built a silver, glass and chromium nanostructure that can all but stop visible light cold in one direction while giving it a pass in the other.* The device could someday play a role in optical information processing and in novel biosensing devices.

Schematic of NIST's one-way metamaterial. Forward-traveling green or red light (left) passes through the multilayered block and comes out at an angle due to diffraction off of grates on the surface of the material. Light traveling in the opposite direction (right) is almost completely blocked by the metamaterial and cannot pass through.
Credit: Xu/NIST

In recent years, scientists have designed nanostructured materials that allow microwave or infrared light to propagate in only one direction. Such structures hold potential for applications in optical communication—for instance, they could be integrated into photonic chips that split or combine signals carried by light waves. But, until now, no one had achieved one-way transmission of visible light, because existing devices could not be fabricated at scales small enough to manipulate visible light's short wavelengths. (So-called "one-way mirrors" don't really do this—they play tricks with relative light levels.)

To get around that roadblock, NIST researchers Ting Xu and Henri Lezec combined two light-manipulating nanostructures: a multi-layered block of alternating silver and glass sheets and metal grates with very narrow spacings.

The silver-glass structure is an example of a "hyperbolic" metamaterial, which treats light differently depending on which direction the waves are traveling. Because the structure's layers are only tens of nanometers thick—much thinner than visible light's 400 to 700 nanometer wavelengths—the block is opaque to visible light coming in from outside. Light can, however, propagate inside the material within a narrow range of angles.
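The "hyperbolic" label comes from the effective-medium dispersion relation for such a metal-dielectric stack. This is the textbook form for TM-polarized waves, not an equation taken from the NIST paper: writing \(\varepsilon_\parallel\) for the in-plane permittivity of the layers and \(\varepsilon_\perp\) for the permittivity along the stacking direction,

\[
\frac{k_x^2 + k_y^2}{\varepsilon_\perp} + \frac{k_z^2}{\varepsilon_\parallel} = \frac{\omega^2}{c^2}.
\]

When \(\varepsilon_\parallel\) and \(\varepsilon_\perp\) have opposite signs, as they can in alternating silver and glass layers, the surface of allowed wavevectors becomes a hyperboloid rather than a sphere; that is why light propagates inside the block only within a restricted range of angles.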

Xu and Lezec used thin-film deposition techniques at the NIST NanoFab user facility to build a hyperbolic metamaterial block. Guided by computer simulations, they fabricated the block out of 20 extremely thin alternating layers of silicon dioxide glass and silver. To coax external light into the layered material, the researchers added to the block a set of chromium grates with narrow, sub-wavelength spacings chosen to bend incoming red or green light waves just enough to propagate inside the block. On the other side of the block, the researchers added another set of grates to kick light back out of the structure, although angled away from its original direction.

While the second set of grates let light escape the material, their spacing was slightly different from that of the first grates. As a result, the reverse-direction grates bent incoming light either too much or not enough to propagate inside the silver-glass layers. Testing their structures, the researchers found that around 30 times more light passed through in the forward direction than in reverse, a contrast larger than any other achieved thus far with visible light.
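The role of the grates can be made concrete with the standard diffraction-grating relation (textbook optics, with notation of my own choosing rather than the paper's): a grating of period \(\Lambda\) redirects light of wavelength \(\lambda\) incident at angle \(\theta_i\) into orders at angles \(\theta_m\) satisfying

\[
\sin\theta_m = \sin\theta_i + m\,\frac{\lambda}{\Lambda}, \qquad m = 0, \pm 1, \pm 2, \ldots
\]

Choosing \(\Lambda\) on the input face so that a diffracted order falls within the block's narrow internal propagation cone admits forward light; the slightly different \(\Lambda\) on the output face deflects reverse-traveling light to angles that miss that cone, which is the asymmetry described above.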

Combining materials that could be made using existing methods was the key to achieving one-way transmission of visible light, Lezec says. Without the intervening silver-and-glass blocks, the grates would have needed to be fabricated and aligned more precisely than is possible with current techniques. "This three-step process actually relaxes the fabrication constraints," Lezec says.

In the future, the new structure could be integrated into photonic chips that process information with light instead of electricity. Lezec thinks the device also could be used to detect tiny particles for biosensing applications. Like the chromium grates, nanoscale particles also can deflect light to angles steep enough to travel through the hyperbolic material and come out the other side, where the light would be collected by a detector. Xu has run simulations suggesting such a scheme could provide high-contrast particle detection and is hoping to test the idea soon. "I think it's a cool device where you would be able to sense the presence of a very small particle on the surface through a dramatic change in light transmission," says Lezec.

*T. Xu and H.J. Lezec. Visible-frequency asymmetric transmission devices incorporating a hyperbolic metamaterial. Nature Communications. 2014, 5, DOI: 10.1038/ncomms5141. www.nature.com/ncomms/2014/140617/ncomms5141/full/ncomms5141.html

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


NIST to Award up to $2.5 Million for Business-to-Business Matching Systems

Through 25 years of working with small and mid-size U.S. manufacturers, the National Institute of Standards and Technology (NIST) Hollings Manufacturing Extension Partnership (MEP) centers across the country have seen that these companies often need help finding technologies and timely business opportunities, as well as making potential customers aware of their capabilities. To help meet those needs, NIST is offering up to $2.5 million in grants to MEP centers for pilot business and technology matching systems.

"MEP centers provide critical services and resources to manufacturers nationwide. The NIST funding announced today will boost MEP's efforts to help companies connect to the right technologies, customers and suppliers, which will help grow their businesses and strengthen the U.S. manufacturing sector," said U.S. Secretary of Commerce Penny Pritzker.

NIST expects that the pilot Business-to-Business, or B2B, projects will emphasize people-based networks, backed up by software systems that actively track opportunities and get to know client needs. The systems should include flexible IT platforms, a sustainable business model for network transactions, and robust content management and education efforts.

The B2B Network pilots must include an approach that makes it easy to attribute outcomes such as projects started, connections made, commercializations and sales. The pilots should also provide matchmakers with current, accurate information so they can have reasonable confidence in the value of opportunities, as well as metrics upon which to evaluate network performance and efficacy.

NIST expects to award up to 10 cooperative agreements, each funded at up to $250,000 for a performance period of up to two years.

Full details of the pilot program were posted to Grants.gov on June 20, 2014, under Funding Opportunity Number 2014-NIST-MEP-B2BN-012. Additional information is available on the MEP website at www.nist.gov/mep/ffo_b2b.cfm. Applications are due to NIST by August 4, 2014.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


New York Area Disaster Resilience Workshop Set for July 30, 2014

The second in a series of regional workshops devoted to developing a community-centric "disaster resilience framework" will be held July 30, 2014, at the Stevens Institute of Technology in Hoboken, N.J. The framework is intended to minimize the impacts of hazards and to quickly restore vital functions and services in the aftermath of disasters.

The workshop is sponsored by the National Institute of Standards and Technology (NIST). As part of President Obama's Climate Action Plan, NIST is leading a collaborative nationwide effort to develop a framework that U.S. communities can use to prepare for, resist, respond to, and recover from hazard events more rapidly and at a lower cost.

The workshop will begin with a session on resilience lessons learned from Hurricane Sandy, the 2012 "superstorm" that affected many states along the Atlantic seaboard. Sandy killed more than 150 people, caused an estimated $65 billion in damage, and left millions without power for extended periods. The devastation also underscored the complex web of interdependencies and vulnerabilities of buildings and infrastructure systems.

In breakout sessions, participants will help to develop sections of the framework, which will focus on communities, buildings, and infrastructure lifelines. Topics will include buildings and facilities, transportation systems, energy systems, communication and information systems, water and wastewater systems, and social vulnerabilities.

NIST seeks input from a broad array of stakeholders, including planners, designers, facility owners and users, government officials, utility owners, regulators, standards and model code developers, insurers, trade and professional associations, disaster response and recovery groups, and researchers.

The disaster resilience framework will establish overall performance goals; assess existing standards, codes, and practices; and identify gaps that must be addressed to bolster community resilience. NIST will incorporate input from this workshop and two others into an initial draft of the framework, which will be issued for public comment in April 2015.

The workshop registration fee is $50. To access the registration site and to view the agenda, go to: www.nist.gov/el/building_materials/resilience/2nd-disaster-resilience-workshop.cfm

To learn more about NIST's Disaster Resilience Program, go to: www.nist.gov/el/building_materials/resilience/

Media Contact: Mark Bello, mark.bello@nist.gov, 301-975-3776


The Vivid Realism of 3D Functions, Right There in Your Browser

A classic online mathematical reference offered by the National Institute of Standards and Technology (NIST) now features a better way for users to view its most complicated illustrations—three-dimensional graphs of math functions that are so complex they call mountain ranges to mind.

The DLMF contains hundreds of interactive 3-D visualizations of complex functions, which now can be rotated and manipulated in most web browsers without the need for a special plugin.
Credit: Antonishek/NIST

The Digital Library of Mathematical Functions (DLMF), which was released online in 2010,* contains hundreds of function graphs, nearly 200 of which are available as interactive 3-D visualizations of complex functions. Like mountain ranges, these visualizations can be hard to appreciate fully from a single vantage point; it helps to look at one from many angles to see how all its peaks and valleys fit together. The DLMF allows readers to do just that, by rotating the graphs and zooming in.

But until now, manipulating them required a special browser plugin. No more. A NIST team has converted these visualizations to an alternative format called WebGL, a JavaScript-based technology that renders 3D graphics directly in any compliant web browser without a plugin. The two-year effort also improved the overall quality of the visualizations. New features include a "bottom" viewpoint and technical improvements to an innovative DLMF feature that allows users to slice through the 3D graph with a plane in order to visualize the contour of the plane's intersection with the graph.
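The slicing feature is easy to picture: intersect the surface z = f(x, y) with a plane, say x = x0, and plot the resulting curve. A minimal Python/matplotlib sketch of the concept (illustrative only; the DLMF itself renders its visualizations in WebGL, and the function here is a made-up stand-in for a special function):

```python
import numpy as np
import matplotlib.pyplot as plt

# A stand-in "special function" surface: z = f(x, y).
f = lambda x, y: np.sin(3 * x) * np.exp(-y**2)

x = np.linspace(-2, 2, 200)
y = np.linspace(-2, 2, 200)
X, Y = np.meshgrid(x, y)

fig = plt.figure(figsize=(10, 4))
ax3d = fig.add_subplot(121, projection="3d")
ax3d.plot_surface(X, Y, f(X, Y), cmap="viridis", alpha=0.8)

# Slice the surface with the plane x = 0.5 and plot the intersection contour.
x0 = 0.5
ax2d = fig.add_subplot(122)
ax2d.plot(y, f(x0, y))
ax2d.set_title(f"Surface contour along the plane x = {x0}")
plt.show()
```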

"Today most major browsers are WebGL enabled on Windows, Mac and Linux platforms," says Bonita Saunders, the DLMF's Visualization Editor. "However, we will continue to offer options for viewing the DLMF visualizations using plugins until we are more confident about WebGL support across most browsers and platforms."

Saunders says the team continues to consider new ways to enhance the DLMF experience based on user feedback, and that in the future they hope to add features such as the ability to make a visualization spin continuously.

*See the 2010 Tech Beat story, "NIST Releases Successor to Venerable Handbook of Math Functions" at http://www.nist.gov/public_affairs/tech-beat/tb20100511.cfm#math.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


NIST Names Members to First Forensic Science Standards Board

Forensic firearms and toolmarks: Members of the new Forensic Science Standards Board will coordinate development of consensus standards by committees dedicated to various forensic science disciplines, including firearms and toolmarks. Impressions made on the surface of a cartridge case when a gun is fired can act like fingerprints to identify a specific firearm.
Credit: NIST

As part of its efforts to improve the scientific basis of forensic evidence used in courts of law, the National Institute of Standards and Technology (NIST) and the Department of Justice (DOJ) have made the first appointments to a new organization dedicated to identifying and fostering development and adoption of standards and guidelines for the nation's forensic science community.

NIST and DOJ named 17 academic researchers and forensic science experts to the Forensic Science Standards Board (FSSB), a key component of NIST's Organization of Scientific Area Committees (OSAC), which is bringing a uniform structure to what was previously an ad hoc system.

"The appointments to the Forensic Science Standards Board essentially mark a transition from planning to doing," said NIST Acting Director Willie May. "After months of collaboration with the forensic science community, we are bringing to life this new organization that will have a positive impact on the practice of forensic science in the United States."

The board will oversee three resource committees and five scientific area committees. Subcommittees will focus on specific disciplines, including DNA, toxicology, medico-legal death investigation, facial identification, latent fingerprints and firearms and toolmarks, among others. The subcommittees will propose consensus documentary standards, for adoption by the board, to improve quality and consistency of work in the forensic science community.


Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


New Math Technique Improves Atomic Property Predictions to Historic Accuracy

[This article was revised on July 23, 2014, to clarify the relationship between the work reported here and the related problem of calculating relativistic and quantum effects on electron energy levels. The latter becomes a significant factor for the higher atomic number atoms. - Editor]

By combining advanced mathematics with high-performance computing, scientists at the National Institute of Standards and Technology (NIST) and Indiana University (IU) have developed a tool that allowed them to calculate a fundamental property of a number of atoms on the periodic table to historic accuracy—reducing error by a factor of a thousand in many cases. Computational techniques of this type could be used someday to determine a host of other atomic properties important in fields like nuclear medicine and astrophysics.*

Computational techniques developed by a team from NIST and IU could enable precise computation of atomic properties that are important for nuclear medicine, as well as astrophysics and other fields of atomic research.
Credit: © Paco Ayala (Fotolia)

NIST's James Sims and IU's Stanley Hagstrom have calculated the nonrelativistic base energy levels for the four electrons in the element beryllium, as well as for the positive ions of all the other elements having the same number of electrons as beryllium, an accomplishment that has required nearly an entire career's effort on Sims' part. (Electron energy levels also depend on relativistic and quantum effects arising from the atom's nucleus, but these are negligible until you get to much larger atoms.) Precise determination of the base energy—crucial for determining the amount of energy necessary to raise an atom from its base energy level to any higher level—has great intrinsic value for fundamental atomic research, but the team's technique has implications far broader than for a single element.
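The "nonrelativistic base energy" is the lowest eigenvalue of the standard fixed-nucleus, nonrelativistic Hamiltonian, written here in textbook form in atomic units for a four-electron atom or ion of nuclear charge Z (the paper's Hylleraas-configuration-interaction machinery is a method for solving this equation to extreme precision):

\[
\hat{H} = \sum_{i=1}^{4} \left( -\tfrac{1}{2}\nabla_i^2 - \frac{Z}{r_i} \right) + \sum_{i<j}^{4} \frac{1}{r_{ij}}.
\]

The electron-electron repulsion terms \(1/r_{ij}\) couple all four electrons at once and are what make the problem so hard. Setting Z = 4 gives neutral beryllium; larger Z values give the positive ions of the beryllium isoelectronic sequence mentioned above.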

Sims says the technique allowed the calculation of energy levels with eight-decimal accuracy, resulting in a remarkably smooth curve that they expected theoretically but were not sure they would attain in practice. For the vast majority of the elements in the periodic table, the calculated results are a thousand times more accurate than previously reported values for the nonrelativistic model. The results, according to Sims, suggest their method could enable computation of other atomic properties—electron affinity and ionization potential, for example—that are important for astrophysics and other fields of atomic research.

Their method is the culmination of decades of effort aimed at using quantum mechanics to predict base energy levels from first principles. Sims first proposed in the late 1960s that such a quantum approach could be possible, but the complex calculations involved were beyond the reach of the world's best computers. Only in 2006, after the advent of parallel computing—linking many computers together as a unified cluster—were he and Hagstrom able to create workable algorithms for calculating the energies for a two-electron hydrogen molecule more accurately than could be done experimentally. Then, in 2010, they improved the algorithms to bring lithium's three electrons within reach.**

Beryllium's four electrons proved a new hurdle, but perhaps the last significant one. Much of the difficulty stems from the fact that mutual repulsion among the electrons, combined with their attraction for the nucleus, creates a complex set of interacting forces that are at least time-consuming, if not practically impossible, to calculate. The complexity grows with the addition of each new electron, but the team found a mathematical approach that can reduce an atom's electron cloud to a group of problems, none of which are more complex than solving a four-electron system.

Calling their approach a shortcut would be in some ways a misnomer. Where the calculation for lithium required a cluster of 32 parallel processors, beryllium required 256, and even then, the cluster needed to operate at extremely high efficiency for days. But the payoff was that they could calculate the energies for all four-electron ground states—meaning not only all of the elements in beryllium's column on the periodic table, each of which has four electrons in its outer shell, but also for all other elements in ionized states that have four electrons, such as boron with one electron missing, carbon missing two, and so forth. Relativistic and other effects are not included in the current model and become more significant for larger atomic numbers, but this study does demonstrate the importance of careful analysis and parallel computational approaches to enable "virtual measurement" of atomic properties based on theory, according to NIST researchers.

*J.A. Sims and S.A. Hagstrom. Hylleraas-configuration-interaction nonrelativistic energies for the 1S ground states of the beryllium isoelectronic sequence. Journal of Chemical Physics. 2014, DOI: 10.1063/1.4881639, June 11, 2014.

**See the 2010 Tech Beat article, "Theorists Close In on Improved Atomic Property Predictions" at www.nist.gov/public_affairs/tech-beat/tb20100112.cfm#atomic.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


NIST Seeks Members for Three New Cloud Computing Working Groups

The National Institute of Standards and Technology (NIST) Cloud Computing Program (NCCP) is forming three public working groups to provide solutions to cloud computing challenges. A teleconference on Wednesday, June 25, 2014, at 11 a.m. Eastern will kick off the effort. Program leaders will discuss group goals, member roles and responsibilities, meeting schedules and deadlines.

The NCCP provides leadership and guidance to catalyze and facilitate the use of cloud computing within industry and government.

NIST's cloud computing public working groups bring together industry, government and academic experts from around the world to address requirements laid out in the NIST U.S. Government Cloud Computing Technology Roadmap, Release 1.0 (Draft).* Earlier public working groups drew up the roadmap through their combined efforts, and others are working on security, standards and accessibility.

Two of the key challenges for cloud computing flagged in the roadmap are interoperability—software and components working together—and portability—allowing data owners to easily move data from one cloud to another or change cloud vendors. The Interoperability and Portability for Cloud Computing working group will identify the issues and types of interoperability and portability needed for cloud computing systems, as well as the relationships and interactions between the two.

The Cloud Services group will address the roadmap's Requirement 4—clearly and consistently categorized cloud services. Cloud customers do not generally own the cloud hardware; instead they obtain services from a cloud provider. The cloud computing definition promulgated by NIST** categorizes cloud services as Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS). Recently, dozens of new types of cloud services and acronyms for them have popped up in the marketplace. The new public working group will use the NIST Cloud Computing Reference Architecture*** to provide consistent categories of cloud services so buyers know what they are committing to before signing potentially costly, long-term contracts.

In the future, organizations may use federated clouds: systems that access internal and external cloud resources from multiple providers to meet their business needs. The third public working group will tackle the roadmap's Requirement 5—Frameworks to support seamless implementation of federated community cloud environments. This group will define the term "federated cloud" and develop a path to its implementation.

For more information on participating in the new public working groups, including call-in numbers, go to http://www.nist.gov/itl/cloud/announcement-of-three-new-wg.cfm. For more on the NIST Cloud Computing Program see www.nist.gov/itl/cloud.

*L. Badger, D. Bernstein, R. Bohn, F. de Vaulx, M. Hogan, J. Mao, J. Messina, K. Mills, A. Sokol, J. Tong, F. Whiteside, D. Leaf. US Government Cloud Computing Technology Roadmap Volume 1, Release 1.0 (Draft). (Special Publication 500-293 Draft). November 2011. Available at www.nist.gov/itl/cloud/upload/SP_500_293_volumeI-2.pdf.
**P. Mell and T. Grance. The NIST Definition of Cloud Computing (Special Publication 800-145). September 2011. Available at www.nist.gov/manuscript-publication-search.cfm?pub_id=909616.
***F. Liu, J. Tong, J. Mao, R. Bohn, J. Messina, L. Badger, D. Leaf. NIST Cloud Computing Reference Architecture (Special Publication 500-292). September 2011. Available at www.nist.gov/manuscript-publication-search.cfm?pub_id=909505.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661
