
Tech Beat - September 13, 2011

Editor: Michael Baum
Date created: September 13, 2011
Date Modified: September 13, 2011 
Contact: inquiries@nist.gov

Shaping Up: Controlling a Stem Cell's Form Can Determine Its Fate

"Form follows function!" was the credo of early 20th century architects making design choices based on the intended use of the structure. Cell biologists may be turning that on its head. New research* by a team working at the National Institute of Standards and Technology (NIST) reinforces the idea that stem cells can be induced to develop into specific types of cells solely by controlling their shape. The results may be important to the design of materials to induce the regeneration of lost or damaged tissues in the body.

Bone-like cell growth on nanofibers: confocal microscope images detail the growth of a human bone marrow stromal cell (actin filaments in the cell "skeleton" are stained orange) on a nanofiber scaffold (green). The structure of thin fibers encourages stem cells to develop into the elongated, branched form characteristic of mature bone cells.
Credit: Tison, Simon/NIST

Tissue engineering seeks to repair or re-grow damaged body tissues, often using some form of stem cells. Stem cells are basic repair units in the body that have the ability to develop into any of several different forms. The NIST experiments looked at primary human bone marrow stromal cells, adult stem cells that can be isolated from bone marrow and can "differentiate" into bone, fat or cartilage cells, depending.

"Depending on what?" is one of the key questions in tissue engineering. How do you ensure that the stem cells turn into the type you need? Chemical cues have been known to work in cases where researchers have identified the proper additives—a hormone in the case of bone cells. Other research has suggested that cell differentiation on flat surfaces can be controlled by patterning the surface to restrict the locations where growing cells can attach themselves.

The experiments at NIST are believed to be the first head-to-head comparison of five popular tissue scaffold designs to examine the effect of architecture alone on bone marrow cells without adding any biochemical supplements other than cell growth medium. The scaffolds, made of a biocompatible polymer, are meant to provide a temporary implant that gives cells a firm structure on which to grow and ultimately rebuild tissue. The experiment included structures made by leaching and foaming processes (resulting in microscopic structures looking like clumps of insect-eaten lettuce), freeform fabrication (like microscopic rods stacked in a crisscross pattern) and electrospun nanofibers (a random nest of thin fibers). Bone marrow stromal cells were cultured on each, then analyzed to see which were most effective at creating deposits of calcium—a telltale of bone cell activity. Microarray analysis also was used to determine patterns of gene expression for the cultured cells.

The results show that the stem cells will differentiate quite efficiently on the nanofiber scaffolds—even without any hormone additives—but not so on the other architectures. The distinction, says NIST biologist Carl Simon, Jr., seems to be shape. Mature bone cells are characteristically long and stringy with several extended branches. Of the five different scaffolds, only the nanofiber one, in effect, forces the cells to a similar shape, long and branched, as they try to find anchor points. Being in the shape of a bone cell seems to induce the cells to activate the genes that ultimately produce bone tissue.

"This suggests that a good strategy to design future scaffolds would be to take into account what shape you want to put the cells in," says Simon, adding, "That's kind of a tall order though, you'd have to understand a lot of stuff: how cell morphology influences cell behavior, and then how the three-dimensional structure can be used to control it." Despite the research still to be done on this method, the ability to physically direct cell differentiation by shape alone potentially would be simpler, cheaper and possibly safer than using biochemical supplements, he says.

The work was supported in part by the National Institute of Dental and Craniofacial Research, National Institutes of Health.

* G. Kumar, C.K. Tison, K. Chatterjee, P.S. Pine, J.H. McDaniel, M.L. Salit, M.F. Young and C.G. Simon, Jr. The determination of stem cell fate by 3D scaffold structures through the control of cell shape. Biomaterials (2011), doi:10.1016/j.biomaterials.2011.08.054.

Media Contact: Michael Baum, baum@nist.gov, 301-975-2763


Double Jeopardy: Building Codes May Underestimate Risks Due to Multiple Hazards

As large parts of the nation recover from nature's one-two punch—an earthquake followed by Hurricane Irene—building researchers from the National Institute of Standards and Technology (NIST) warn that a double whammy of seismic and wind hazards can increase the risk of structural damage to as much as twice the level implied in building codes.

Top: Wind zone map shows how the frequency and strength of extreme windstorms vary across the United States. Wind speeds in Zone IV (red), where the risk of extreme windstorms is greatest, can be as high as 250 miles per hour.

Bottom: National seismic hazards maps display earthquake ground motions for various probability levels across the United States. These maps are the basis for seismic design provisions of building codes, insurance rate structures, and land-use planning.
Credit: (Top) Federal Emergency Management Agency, (Bottom) U.S. Geological Survey

This is because current codes consider natural hazards individually, explains NIST's Dat Duthinh, a research structural engineer. So, if earthquakes rank as the top threat in a particular area, local codes require buildings to withstand a specified seismic load. In contrast, if hurricanes or tornadoes are the chief hazard, homes and buildings must be designed to resist loads up to an established maximum wind speed.

In a timely article published in the Journal of Structural Engineering,* Duthinh, NIST Fellow Emil Simiu and Chiara Crosti (now at the University of Rome) challenge this compartmentalized approach. They show that in areas prone to both seismic and wind hazards, such as South Carolina, the risk that design limits will be exceeded can be as much as twice the risk in regions where only one hazard occurs, even accounting for the fact that these multiple hazards almost never occur simultaneously. As a consequence, buildings designed to meet code requirements in these double-jeopardy locations "do not necessarily achieve the level of safety implied," the researchers write.

Simiu explains by analogy: a motorcycle racer who takes on a second job as a high-wire performer. "By adding this new occupation, the racer increases his risk of injury, even though the timing and nature of the injuries sustained in a motorcycle accident or in a high-wire mishap may differ," he says. "Understandably, an insurer would raise the premium on a personal injury policy to account for the higher level of risk."

The researchers developed a method to assess risks due to wind and earthquakes using a common metric of structural resistance. With a consistent measure (the maximum lateral deflection), the combined risk of failure can be compared to the risk that design limits will be exceeded in regions vulnerable to only one of the hazards, the basis for safety requirements specified in current building codes.
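The arithmetic behind the near-doubling can be sketched with a toy calculation (a simplified illustration, not the authors' actual model, and the probabilities below are hypothetical placeholders): for two statistically independent hazards with small annual exceedance probabilities, the combined probability is roughly their sum, about twice either one when the two risks are of similar severity.

```python
# Simplified illustration of multi-hazard risk (not the authors' model).
# The annual exceedance probabilities below are hypothetical placeholders.

def combined_exceedance(p_wind: float, p_seismic: float) -> float:
    """Probability that at least one hazard exceeds the design limit
    in a given year, assuming the two hazards are independent and
    almost never occur simultaneously."""
    return 1.0 - (1.0 - p_wind) * (1.0 - p_seismic)

p_wind = 0.002     # hypothetical annual exceedance probability, wind
p_seismic = 0.002  # hypothetical annual exceedance probability, earthquake

p_both = combined_exceedance(p_wind, p_seismic)
print(p_both)            # ~0.004: roughly the sum of the two risks
print(p_both / p_wind)   # ~2.0 when the hazards are of similar severity
```

Even though the two events essentially never happen in the same instant, the chance that *some* hazard exceeds the design limit over the structure's life is close to the sum of the individual chances, which is the effect the paper quantifies.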

They demonstrate their approach on three different configurations of a 10-story steel-frame building. One of the configurations used so-called reduced beam sections (RBS) to connect girders to columns. RBS technology was developed after California's Northridge earthquake in 1994, which resulted in significant structural damage in new and old buildings due to unanticipated brittle fractures in frame connections. Shaped like a dog bone, tapered RBS connections made the frames more ductile—better able to deflect without breaking.

In this case study, the researchers found that RBS connections do not decrease the risk that a steel-frame building will exceed its design limit when used in a region exposed to high winds or a region exposed to high winds and earthquakes. Consequently, the risk of failure doubled under dual-hazard conditions, when those conditions are of similar severity. However, they note that RBS connections can decrease the risk that limits associated with seismic design will be exceeded during the structure's life.

The researchers are continuing to extend their methodology and are proposing modifications to building codes.

* C. Crosti, D. Duthinh and E. Simiu. Risk consistency and synergy in multi-hazard design. ASCE Journal of Structural Engineering. Vol. 3, No. 8, Aug. 2011.

Media Contact: Mark Bello, mark.bello@nist.gov, 301-975-3776


U.S., Europe Collaborating on Smart Grid Standards Development

Today, the U.S. Commerce Department's National Institute of Standards and Technology (NIST) and the European Union's (EU) Smart Grid Coordination Group (SG-CG) jointly announced their intention to work together on Smart Grid standards development, emphasizing common goals and areas of focus.

Both NIST and the SG-CG have mandates to coordinate the development of a standards framework for Smart Grids, which can unlock innovation in the electrical sector. The two organizations outlined areas for future collaboration in a joint white paper. The SG-CG represents three private-sector standards organizations: the European Committee for Standardization (CEN), the European Committee for Electrotechnical Standardization (CENELEC) and the European Telecommunications Standards Institute (ETSI).*

Smart Grids are next-generation electrical grids that attempt to predict and intelligently respond to the behavior and actions of all electric power users connected to them—suppliers, consumers and those that do both—in order to efficiently deliver reliable, economical and sustainable electricity services. The new collaboration is meant to ensure that Smart Grid standards on both continents have as much in common as possible, so that devices and systems that interact with these grids can be designed in similar fashion.

"While the potential benefits of Smart Grids are enormous, they can only be fully reached if we can all agree on global solutions," says Ralph Sporer, chairman of SG-CG. "It is promising to see that NIST and SG-CG will be supporting a number of common positions and areas of collaboration to ensure a consistent set of international standards."

Smart Grids are expected to ease the incorporation of renewable energy sources, energy saving devices and electric vehicles into the power system. Overall goals include the reduction of carbon emissions and security of supply. To promote this transformation, governments on both sides of the Atlantic have taken a number of actions in recent years, including the U.S. Energy Independence and Security Act of 2007 and the American Recovery and Reinvestment Act of 2009, and Europe's Directives 2009/72/EC and 2009/73/EC within the framework of the 3rd Package for the Internal Energy Market. This legislative effort has translated into a number of standards initiatives like the NIST Framework and Roadmap for Smart Grid Interoperability Standards in the United States and a Smart Grid mandate in the EU.

The collaboration aims to harmonize these conceptual frameworks. It also will promote the regular exchange of information regarding such issues as:

  • Legislation, regulation and other policies underpinning NIST and SG-CG work
  • Respective work methods, work programs and time lines
  • Standardization deliverables
  • Testing and certification frameworks
  • Cybersecurity requirements and technologies


According to NIST's George Arnold, the National Coordinator for Smart Grid Interoperability in the United States, the many facets of Smart Grid development—spanning multiple sectors of the economy and a wide range of stakeholders—make the standardization effort anything but business as usual. This collaboration, he says, will advance those efforts in the long run.

"The need for integration of multiple technologies, the many international activities, and ever-changing technical solutions within a short time frame make standards development a challenging task for standards organizations worldwide," says Arnold. "But this collaboration should help make sure that no one reinvents the wheel."

Arnold adds that NIST's Smart Grid Interoperability Panel (SGIP) plans to draft a letter of intent outlining the specifics of the collaboration in the near future. The White Paper of NIST and SG-CG on Standardization of Smart Grids is available online at www.nist.gov/smartgrid/upload/eu-us-smartgrids-white-paper.pdf.

* Further information about CEN is available at http://www.cen.eu/cen/AboutUs/Pages/default.aspx; about CENELEC, at http://www.cenelec.eu/aboutcenelec/whoweare/index.html; and about ETSI, at http://www.etsi.org/WebSite/AboutETSI/AboutEtsi.aspx.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


Two New Publications Provide a Cloud Computing Standards Roadmap and Reference Architecture

The National Institute of Standards and Technology (NIST) has published two new documents on cloud computing: the first edition of a cloud computing standards roadmap and a cloud computing reference architecture and taxonomy. Together, the documents provide guidance to help understand cloud computing standards and categories of cloud services that can be used government-wide.

Cloud computing involves five actors: consumer, provider, auditor, broker and carrier. This illustration shows the possible communication paths between them.
Credit: NIST

These documents, along with others from NIST and NIST working groups, will be incorporated into the NIST U.S. Government Cloud Computing Technology Roadmap, expected to be published in November 2011.

Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of computing resources, including servers, data storage, applications and services. NIST is responsible for accelerating the federal government’s secure adoption of cloud computing by leading efforts to develop standards and guidelines in close consultation and collaboration with standards bodies, the private sector and other stakeholders, including federal agencies.

To produce the NIST Cloud Computing Standards Roadmap (NIST Special Publication 500-291), the NIST Cloud Computing Standards Working Group compiled an “Inventory of Standards Relevant to Cloud Computing” that will continue to be updated. The working group includes volunteer participants from industry, government and academia.

The working group categorized these standards for features such as security, portability and interoperability, and identified models, studies and use cases relevant to cloud computing. Many of the standards now being applied to cloud computing were developed for pre-cloud technologies such as web services and the Internet; others are being developed to specifically support cloud functions and requirements.

The working group found a number of gaps in available standards ranging from fundamental issues such as security and privacy protection to user interfaces and important business-oriented features. The group also identified standardization priorities for the federal government, particularly in areas such as security auditing and compliance, and identity and access management.

NIST Standards Working Group co-convener Michael Hogan said, “NIST SP 500-291 encourages federal agencies to become involved with developing specific cloud computing standards projects that support their priorities in cloud computing services to move cloud computing standards forward.”

The publication also suggests that the federal government should recommend specific cloud computing standards and best practices for government-wide use. It can be downloaded from http://collaborate.nist.gov/twiki-cloud-computing/pub/CloudComputing/StandardsRoadmap/NIST_SP_500-291_Jul5A.pdf.

The guiding principles used to create the NIST Cloud Computing Reference Architecture (NIST SP 500-292) were to develop a vendor-neutral architecture, or design, consistent with the NIST cloud definition and to create a solution that does not stifle innovation by defining a prescribed technical solution. The resulting reference architecture and taxonomy (vocabulary) was developed as an actor/role-based system that lays out the central elements of cloud computing for federal chief information officers, procurement officials and IT program managers. Roles of the five cloud “actors”—consumer, provider, broker, auditor and carrier—are defined.

“Our point was to create a level playing field for industry to discuss and compare their cloud offering with the U.S. government,” the NIST Reference Architecture Working Group Co-convener Robert Bohn said. “The publication is also an opportunity for industry to map their reference architecture to the one NIST developed with input from all sectors,” he added. The publication can be found at http://collaborate.nist.gov/twiki-cloud-computing/pub/CloudComputing/ReferenceArchitectureTaxonomy/NIST_SP_500-292_-_090611.pdf.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


NIST Seeks Comments to Help Build Public Safety Communications Network

The National Institute of Standards and Technology (NIST) is seeking advice on possible key features of a new broadband communications network for the nation's emergency services agencies. The network will use a portion of the 700 megahertz (MHz) radio frequency spectrum.

In a Sept. 12, 2011, Federal Register notice, NIST proposed four characteristics critical to the success of the future network—resiliency, reliability and availability, security, and affordability and compatibility with commercial systems—and asked for comments, suggestions and other input to help realize them. Among other things, NIST seeks to understand the extent to which these features and requirements can be satisfied, either with existing technology or with technology that could become available in the relatively near future.

This request for information coincides with the ongoing development of a demonstration testbed of the network by the joint NIST-National Telecommunications and Information Administration (NTIA) Public Safety Communications Research (PSCR) program. The testbed will provide a common site for manufacturers, carriers and public safety agencies to evaluate advanced broadband communications equipment and software tailored specifically to the needs of emergency first responders.

Comments are requested by 5 p.m. Eastern Daylight Time on Oct. 12, 2011, to Dereck Orr at dereck.orr@nist.gov. The complete Federal Register notice, detailing the four key features and the specific traits desired for achieving each one, may be retrieved from www.gpoaccess.gov/fr/retrieve.html. Select "2011 Federal Register, Vol. 76" and then enter "56165" in the search box.

The PSCR program is a partnership of the NIST Law Enforcement Standards Office and NTIA's Institute for Telecommunication Sciences. PSCR provides objective technical support—research, development, testing and evaluation—in order to foster nationwide public safety communications interoperability. More information is available on the PSCR web site at www.pscr.gov.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


Four New Reports Update Security Content Automation Protocol

Bringing order and security to the patchwork quilt of computing environments in a large organization can be a daunting task. Software tools and technical specifications that allow security information to be shared between information systems—the Security Content Automation Protocol (SCAP)—can save time and improve security. The National Institute of Standards and Technology (NIST) recently released four new publications that detail specifications to be used by the latest version of SCAP.

“A primary goal of automated security in a large organization’s computer environment is to make sure everything is configured securely as required by management, and that all patches are applied to eliminate known vulnerabilities,” said computer scientist David Waltermire. SCAP-enabled tools can scan computer systems to reveal software vulnerabilities and security configuration problems to be corrected.

SCAP relies on a fundamental component called Common Platform Enumeration (CPE), which is a standardized method of describing and identifying classes of applications, operating systems and hardware devices in an organization’s computer systems. A new version of CPE has been released—version 2.3—and the four new NIST Interagency Reports (NISTIRs) provide specifications for this version, which will be used with the new SCAP version.

For SCAP to work, CPE must provide a single, unique name for each class of product. Without CPE, for example, different terms, such as “Windows XP” and “Win XP,” are typically used to refer to the same product, which can cause confusion and waste resources. CPE provides one standardized name that covers all of these variants. NISTIR 7695 defines and explains the naming specification for CPE version 2.3.

Once a unique name is defined, CPE needs to compare names to determine whether they refer to some or all of the same products or platforms. For example, a product may have a unique name, but as in the Windows XP example, there may be subsets such as “Service Pack 1” or “Service Pack 2” that may further distinguish types of products. NISTIR 7696 provides the CPE name matching specification, which defines procedures for comparing two CPE names.
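The basic idea of name matching can be sketched in a few lines (a simplified illustration only; the full rules, including many special cases, are defined in NISTIRs 7695 and 7696, and the CPE names below are illustrative):

```python
# Simplified sketch of CPE 2.3-style name matching. This handles only the
# basic "*" (any value) wildcard; the real specification (NISTIR 7696)
# defines many more cases.

def parse_cpe(name: str) -> list:
    """Split a CPE 2.3 formatted string into its attribute fields."""
    parts = name.split(":")
    if parts[:2] != ["cpe", "2.3"]:
        raise ValueError("not a CPE 2.3 formatted string: " + name)
    return parts[2:]

def cpe_superset(a: str, b: str) -> bool:
    """True if name 'a' covers everything 'b' does, treating '*' in 'a'
    as matching any value in the corresponding field of 'b'."""
    return all(x == "*" or x == y
               for x, y in zip(parse_cpe(a), parse_cpe(b)))

# Illustrative names: generic Windows XP vs. Windows XP Service Pack 2
xp_any = "cpe:2.3:o:microsoft:windows_xp:*:*:*:*:*:*:*:*"
xp_sp2 = "cpe:2.3:o:microsoft:windows_xp:-:sp2:*:*:*:*:*:*"

print(cpe_superset(xp_any, xp_sp2))  # True: the generic name covers SP2
print(cpe_superset(xp_sp2, xp_any))  # False: SP2 is more specific
```

A scanner can use this kind of comparison to decide whether an advisory written against the generic product name applies to the specific build it found on a machine.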

A dictionary specification for CPE is defined in NISTIR 7697, which includes the semantics of its data model and the rules associated with the CPE dictionary creation and management. NIST hosts the official CPE dictionary at http://nvd.nist.gov/cpe.cfm so organizations can search for and find identifier names.

With the naming, name matching and dictionary specifications defined, researchers moved to language specifications. NISTIR 7698 provides the applicability language specification, which allows construction of logical expressions built from CPE names. These expressions can be used by SCAP to identify more complex vulnerability and configuration situations, such as a problem that only exists when two applications are running together or an application is running on particular computing platforms. A real-life example is writing an applicability language expression that tells SCAP to search for situations in which Adobe Flash player version 10.3 or earlier is running on Mac OSX, Linux, Sun Solaris or Microsoft Windows.
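The Flash example above boils down to a boolean expression over CPE names. A hypothetical sketch of the logic such an expression encodes (the real applicability language is XML-based and defined in NISTIR 7698; the CPE names here are illustrative):

```python
# Hypothetical sketch of the logic an applicability-language expression
# encodes: Adobe Flash AND any one of several operating systems.
# The real language (NISTIR 7698) is XML-based; CPE names are illustrative.

VULNERABLE_APP = "cpe:2.3:a:adobe:flash_player:10.3:*:*:*:*:*:*:*"
VULNERABLE_OS = (
    "cpe:2.3:o:apple:mac_os_x:*:*:*:*:*:*:*:*",
    "cpe:2.3:o:linux:linux_kernel:*:*:*:*:*:*:*:*",
    "cpe:2.3:o:sun:solaris:*:*:*:*:*:*:*:*",
    "cpe:2.3:o:microsoft:windows:*:*:*:*:*:*:*:*",
)

def applies(installed: set) -> bool:
    """True if the vulnerable combination is present: the Flash player
    together with any one of the listed operating systems."""
    return (VULNERABLE_APP in installed
            and any(os_name in installed for os_name in VULNERABLE_OS))

system = {
    "cpe:2.3:a:adobe:flash_player:10.3:*:*:*:*:*:*:*",
    "cpe:2.3:o:microsoft:windows:*:*:*:*:*:*:*:*",
}
print(applies(system))  # True: Flash on Windows satisfies the expression
```

The point of standardizing this logic is that any SCAP-enabled tool evaluates the same expression the same way, so an advisory's applicability statement means the same thing everywhere.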

A new publication announcing SCAP Version 1.2 is expected to be published soon. For more information on SCAP and other security automation projects, see scap.nist.gov.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


Seventh Annual IT Security Automation Conference Runs Oct. 31-Nov. 2

The Seventh Annual IT Security Automation Conference, co-hosted by the National Institute of Standards and Technology (NIST), will focus on the breadth and depth of principles and technologies designed to support computer security automation for organizations in both the public and private sectors. The conference will be held Oct. 31 through Nov. 2, 2011, at the Crystal City Hyatt Regency Hotel in Crystal City, Va.

Government and industry are turning to IT security automation because it leverages computer standards and specifications to simplify key security tasks, including managing vulnerabilities, measuring security and ensuring compliance to rules and regulations. Automation frees resources to focus on other areas of the IT infrastructure.

The goal of the conference is to provide public- and private-sector executives, security managers and staff, IT professionals, and developers of products and services with a common understanding for using specific open standards and new security technologies. Topics include enabling interoperability among tools, automating risk mitigation measures, defining continuous monitoring, performing continuous monitoring in the cloud, automating integration of network security systems and using processes and tools to make practical risk-based decisions.

The Security Automation Conference will have six tracks: Continuous Monitoring, Software Assurance, Network Automation, IT Security Threats, Specification Updates and Vendor Products and Capabilities. The three-day event includes tutorials, conference proceedings, workshops and a vendor expo.

More information, including the agenda and registration, is available at http://scap.nist.gov/events/index.html.

NIST, the Department of Homeland Security (DHS), the National Security Agency (NSA) and the Defense Information Systems Agency (DISA) co-host this annual event. The four agencies are partnering to develop and implement automation of computer security.

A trade show runs concurrently with the conference and includes more than 30 vendors demonstrating available security automation packages. Reporters interested in attending should contact Evelyn Brown, (301) 975-5661.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


Eleven for '11: Number of Organizations Announced for 2011 Baldrige Award Site Visits

The Panel of Judges for the Malcolm Baldrige National Quality Award, the nation's highest recognition for organizational performance excellence, has selected 11 organizations for the final review stage for the 2011 award. Starting next month, teams of business, education, health care and nonprofit experts will make site visits to one organization in the small business category, one in education, six in health care and three in nonprofit.

The Baldrige Performance Excellence Program received 69 applications in 2011 (two manufacturers, three service companies, two small businesses, eight educational organizations, 40 health care organizations and 14 nonprofits/governmental organizations). The applicants were evaluated rigorously by an independent board of 553 examiners in seven areas: leadership; strategic planning; customer focus; measurement, analysis and knowledge management; workforce focus; operations focus; and results. Examiners will provide 300 to 1,000 hours of review to each applicant receiving a site visit, and all applicants will receive a detailed report on the organization's strengths and opportunities for improvement.

The 2011 Baldrige Award recipients are expected to be announced in late November 2011.

Named after Malcolm Baldrige, the 26th Secretary of Commerce, the Baldrige Award was established by Congress in 1987. The award—managed by the National Institute of Standards and Technology (NIST) in collaboration with the private sector—promotes excellence in organizational performance, recognizes the achievements and results of U.S. organizations, and publicizes successful performance strategies. The award is not given for specific products or services. Since 1988, 86 organizations have received Baldrige Awards.

Thousands of organizations use the Baldrige Criteria for Performance Excellence to guide their enterprises, improve performance and get sustainable results. This proven improvement and innovation framework offers organizations an integrated approach to key management areas.

For more information on the Baldrige Performance Excellence Program and the Baldrige Award, see www.nist.gov/baldrige.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


NIST to Host Workshop on Cryptography for New Technologies

The National Institute of Standards and Technology (NIST) will host a workshop on cryptography for new technologies from Nov. 7-8, 2011, at the agency’s Gaithersburg, Md., campus.

As the Internet evolves, it is becoming possible for objects other than typical computers—DVD players that link to video-streaming websites, for example—to connect to it. The Workshop on Cryptography for Emerging Technologies and Applications will provide an opportunity for attendees to learn about the challenges of securing these new situations, as well as about NIST's current cryptographic programs and standards development. Subjects to be covered include sensor and building networks, mobile devices, smart objects/Internet of things, and cyber physical systems.

In preparation for the workshop, NIST is calling for abstracts that address these topic areas. The submission deadline is September 26, 2011, at 5 p.m. Eastern time. Abstracts should be sent to ceta-workshop@nist.gov, using the subject line: "CETA Workshop Abstract Submission."

Those interested in attending must register at https://www.fbcinc.com/nist_Crypt/atreg1.aspx. More information is available at http://www.nist.gov/itl/csd/ct/ceta-workshop.cfm.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261
