In This Issue...
Today, Pause for a Second
For the great majority of human history, the Earth's rotation—the apparent motion of the sun and stars across the sky—kept time far more precisely than any human-made clock. But the invention of super-accurate atomic clocks more than 60 years ago demonstrated that the Earth's rotation rate is, in fact, changing by tiny amounts every day.
The Earth is supposed to make one full rotation every 86,400 seconds, but a range of complex factors changes the rotation rate by a millisecond or so each day. Sometimes it speeds up, sometimes it slows down, but on average, the Earth's rotation is slowing little by little; at a millisecond or so a day, the offset adds up to a full second in a few years. So, in order to keep Coordinated Universal Time (UTC), measured by atomic clocks, in line with astronomical time, leap seconds are occasionally needed.
On June 30, 2015, at 23:59:59 UTC (just before 8 p.m. Eastern Daylight Time) the world will add its 26th leap second since 1972. This is the first leap second since 1997 to occur during normal business hours in parts of the U.S., particularly on the West Coast. Because many systems rely on precision timing synchronized to UTC, several U.S. government agencies, including NIST, have provided a best practices guide to help people respond to the upcoming leap second.
UTC, official international time, is a hybrid of timing based on atomic clocks to provide the precise "ticking rate" and astronomical time based on the position of the sun and stars. Because atomic clocks are much more stable than the earth's rotation, atomic time and astronomical time would very slowly diverge, until in the very distant future atomic clocks would read "midnight" while the sun was directly overhead. Leap seconds are added when necessary to keep atomic time and astronomical time synchronized to better than one second.
Because the Earth's rotation speeds up and slows down in an unpredictable manner (the only thing we do know is that the rotation rate will slow down over the long term because of gravitational forces from the moon and sun), it is not possible for scientists to know exactly when to add leap seconds more than a few months ahead of time. While this gives people some time to prepare, adding a leap second to systems tied to atomic time can be problematic. In particular, adding leap seconds can cause interruptions or failures in data logging, telecommunications, and time distribution systems. During the most recent leap second at the end of December 31, 2012, sections of the Internet were interrupted for a few hours. Since then, applications requiring precision timing have become even more widespread, raising concerns that if the leap second is not properly implemented on June 30, it could have broader impacts.
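Part of what makes a leap second awkward for software is the timestamp sequence itself: the inserted second appears as 23:59:60, an instant most time libraries cannot represent directly. The sketch below (Python; the leap-second dates shown are real announced UTC leap seconds, but the helper and table names are illustrative, not from any NIST tool) formats the final seconds of a leap-second day:

```python
# Sketch: formatting the extra second at the end of a leap-second day.
# The dates in LEAP_SECOND_DAYS are real announced leap seconds; the
# function and table names are invented for illustration.
from datetime import datetime, timedelta

# UTC days (a sample) that ended, or will end, with a positive leap second
LEAP_SECOND_DAYS = {
    datetime(2012, 6, 30).date(),
    datetime(2015, 6, 30).date(),
}

def last_seconds_of_day(day):
    """Yield the final UTC timestamps of `day` as strings, including
    the inserted 23:59:60 if the day ends with a leap second."""
    for label in ("23:59:58", "23:59:59"):
        yield f"{day}T{label}"
    if day in LEAP_SECOND_DAYS:
        # Most software clocks cannot represent this instant natively,
        # so it is produced here as a formatted string only.
        yield f"{day}T23:59:60"
    yield f"{day + timedelta(days=1)}T00:00:00"

print(list(last_seconds_of_day(datetime(2015, 6, 30).date())))
# → ['2015-06-30T23:59:58', '2015-06-30T23:59:59',
#    '2015-06-30T23:59:60', '2015-07-01T00:00:00']
```

In practice, many systems sidestep the unrepresentable instant by repeating 23:59:59 or by "smearing" the extra second across many hours, and disagreement between such strategies is one source of the kinds of interruptions seen in 2012.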
The international community is discussing a range of options for the future of global timing, from keeping the current system of leap seconds, to eliminating leap seconds entirely, to consolidating leap seconds into much less frequent leap hours. The Radiocommunication Sector of the International Telecommunication Union (ITU-R), which represents nearly all the world's nations, will make the final decision.
Media Contact: Laura Ost, firstname.lastname@example.org, 303-497-4880
NIST ‘How-To’ Website Documents Procedures for Nano-EHS Research and Testing
As engineered nanomaterials increasingly find their way into commercial products, researchers who study the potential environmental or health impacts of those materials face a growing challenge to accurately measure and characterize them. These challenges affect measurements of basic chemical and physical properties as well as toxicology assessments.
To help nano-EHS (Environment, Health and Safety) researchers navigate the often complex measurement issues, the National Institute of Standards and Technology (NIST) has launched a new website devoted to NIST-developed (or co-developed) and validated laboratory protocols for nano-EHS studies.
In common lab parlance, a "protocol" is a specific step-by-step procedure used to carry out a measurement or related activity, including all the chemicals and equipment required. Any peer-reviewed journal article reporting an experimental result has a "methods" section where the authors document their measurement protocol, but those descriptions are necessarily brief and condensed, and may lack validation of any sort. By comparison, on NIST's new Protocols for Nano-EHS website, the protocols are extraordinarily detailed. For ease of citation, they're published individually—each with its own unique digital object identifier (DOI).
The protocols detail not only what you should do, but why and what could go wrong. The specificity is important, according to program director Debra Kaiser, because of the inherent difficulty of making reliable measurements of such small materials. "Often, if you do something seemingly trivial—use a different size pipette, for example—you get a different result. Our goal is to help people get data they can reproduce, data they can trust."
A typical caution, for example, notes that if you're using an instrument that measures the size of nanoparticles in a solution by how they scatter light, it's important also to measure the transmission spectrum of the particles if they're colored, because if they happen to absorb light strongly at the same frequency as your instrument, the result may be biased.
"These measurements are difficult because of the small size involved," explains Kaiser. "Very few new instruments have been developed for this. People are adapting existing instruments and methods for the job, but often those instruments are being operated close to their limits and the methods were developed for chemicals or bulk materials and not for nanomaterials."
"For example, NIST offers a reference material for measuring the size of gold nanoparticles in solution, and we report six different sizes depending on the instrument you use. We do it that way because different instruments sense different aspects of a nanoparticle's dimensions. An electron microscope is telling you something different than a dynamic light scattering instrument, and the researcher needs to understand that."
The nano-EHS protocols offered by the NIST site, Kaiser says, could form the basis for consensus-based, formal test methods such as those published by ASTM and ISO.
NIST's nano-EHS protocol site currently lists 12 different protocols in three categories: sample preparation, physico-chemical measurements and toxicological measurements. More protocols will be added as they are validated and documented. Suggestions for additional protocols are welcome at email@example.com.
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763
Ultra-stable JILA Microscopy Technique Tracks Tiny Objects for Hours
JILA researchers have designed a microscope instrument so stable that it can accurately measure the 3D movement of individual molecules over many hours—hundreds of times longer than the current limit measured in seconds.*
The technology was designed to track the machinery of biological cells, down to the tiniest bits of DNA, a single "base pair" of nucleotides among the 3 billion of these chemical units in human genes. But the instrument could be useful well beyond biology, biochemistry and biophysics, perhaps in manufacturing.
JILA is a partnership of the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder.
"This technology can actively stabilize two items relative to each other with a precision well below one nanometer at room temperature," JILA/NIST physicist Tom Perkins says. "This level of 3D stability may start to interest the nanomanufacturing world, when they look at making and characterizing things on the single-nanometer scale."
The work builds on JILA's world-leading expertise in measuring positions of microscopic objects. The latest tweaks extend stability for a much longer time period, many hours at a time. With the longer observation times, researchers can see more successive steps of molecular motors, for instance. These biochemical processes are responsible for a broad range of movement in living organisms, including moving molecules around the interior of a cell or copying DNA into another form of genetic material, RNA. The new JILA instrument also can aid in measuring individual proteins as they fold into specific positions, a process required for them to work properly.
Until now, researchers had difficulty detecting more than a few individual, one-base-pair steps in succession before instrumental "drift" would blur the signal. Observing such sets of repetitive steps is very rare. The instrument must be stable to within about one-tenth of a nanometer (1 angstrom, roughly the diameter of a hydrogen atom).
Typically, a microscope can only occasionally achieve this level of stability. But when augmented by the new JILA measurement platform, it can reliably achieve one-tenth-of-a-nanometer stability for up to 100 seconds at a time. And it can do this over and over again for extended periods—the JILA team operated the system for up to 28 hours straight.
In addition to its high precision and stability, the instrument can detect motion over a wide range of time scales, critical for calibrating instruments and measuring short-lived states in protein folding. The JILA method can be applied to optical trapping techniques, atomic force microscopes and super-resolution imaging.
The method uses two lasers to measure the positions of opposite ends of a molecule, or two different objects, based on the intensity of scattered light. The scattered light is detected by a common photodiode, and the signals are digitized, analyzed and used to calculate the positions of the samples. Crucially, the JILA team verified the stability of the technique by using the two lasers to make two separate, independent measurements of a single sample. Without this confirmation, researchers can't determine if it is the sample or the lasers moving, Perkins explains.
"This technology excites me because it opens the door to measuring the tiniest protein motions," Perkins says."
The research was supported by the National Science Foundation and NIST.
* R. Walder, D.H. Paik, M.S. Bull, C. Sauer and T.T. Perkins. Ultrastable measurement platform: sub-nm drift over hours in 3D at room temperature. Optics Express. Vol. 23, Issue 13, 2015. pp. 16554-16564. DOI: 10.1364/OE.23.016554.
Media Contact: Laura Ost, email@example.com, 303-497-4880
NIST Group Maps Distribution of Carbon Nanotubes in Composite Materials
Despite their small size and simple structure, carbon nanotubes—essentially sheets of graphene rolled up into straws—have all sorts of potentially useful properties. Still, while their promise looms large, how to fully realize that promise has proven to be something of a mystery.
In an effort to strip away some of that mystery, researchers from the National Institute of Standards and Technology (NIST), the Massachusetts Institute of Technology and the University of Maryland have developed cutting-edge image gathering and processing techniques to map the nanoscale structure of carbon nanotubes inside a composite material in 3-D. Exactly how the nanotubes are distributed and arranged within the material plays an important role in its overall properties. The new data will help researchers studying composite materials to build and test realistic computer models of materials with a wide array of thermal, electrical, and mechanical features.
Their research was featured in ACS Nano.*
Carbon fiber composites are typically prized for their high strength and low weight, and carbon nanotube (CNT) composites (or nanocomposites), which have more and smaller carbon filaments, show promise for high strength as well as other properties such as the ability to conduct heat and electricity.
However, according to NIST's Alex Liddle, an author on the study, while researchers previously could reliably measure a nanocomposite's bulk properties, they didn't know exactly why various formulations of the composite had different properties.
"Figuring out why these materials have the properties they do requires a detailed, quantitative understanding of their complex 3-D structure," says Liddle. "We need to know not only the concentration of nanotubes but also their shape and position, and relate that to the properties of the material."
Seeing the arrangement of carbon nanotubes in a composite material is tough, though, because they're surrounded by an epoxy resin which also is mostly carbon atoms. Even with sophisticated probes the contrast is too low for software image processors to pick them out easily.
In such research situations, you turn to graduate students and postdocs like NIST's Bharath Natarajan, because humans generally make great image processors. But marking thousands of carbon nanotubes in an image is mighty boring, so Natarajan designed an image-processing algorithm that can distinguish CNTs from an epoxy resin as well as he can. It paid off.
According to Liddle, a CNT expresses its full potential in strength and thermal and electrical conductivity when it is stretched out and straight, but …
"When CNTs are suspended in an epoxy resin, they spread out, bundle and twist into different shapes," Liddle says. "Our analysis revealed that the benefits of CNTs increases in a non-linear fashion as their concentration increases. As the concentration raises, the CNTs come into contact, increasing the number of intersections, which increases their electrical and thermal conductivity, and the physical contact causes them to conform to one another, which straightens them, increasing the material's strength."
The fact that increasing the concentration of CNTs enhances properties is not particularly surprising, but now researchers know how this affects the materials' properties and why earlier models of nanocomposite materials' performance never quite matched how they performed in practice.
"We've really only seen the tip of the iceberg with respect to this class of material," says Liddle. "There are all sort of ways other researchers might slice and dice the data to model and eventually manufacture optimal materials for thermal management, mechanical reinforcement, energy storage, drug transport and other uses."
This work was performed in part at NIST's Center for Nanoscale Science and Technology (CNST), a national user facility available to researchers from industry, academia and government.
* B. Natarajan, N. Lachman, T. Lam, D. Jacobs, C. Long, M. Zhao, B. Wardle, R. Sharma and J. Liddle. The evolution of carbon nanotube network structure in unidirectional nanocomposites resolved by quantitative electron tomography. ACS Nano. DOI: 10.1021/acsnano.5b01044. Published online June 1, 2015.
Media Contact: Mark Esser, firstname.lastname@example.org, 301-975-8735
NIST Revises Key Computer Security Publication on Random Number Generation
In response to public concerns about cryptographic security, the National Institute of Standards and Technology (NIST) has formally revised its recommended methods for generating random numbers, a crucial element in protecting private messages and other types of electronic data. The action implements changes to the methods that were proposed by NIST last year in a draft document issued for public comment.
The updated document, Recommendation for Random Number Generation Using Deterministic Random Bit Generators, describes algorithms that can be used to reliably generate random numbers, a key step in data encryption.
One of the most significant changes to the document is the removal of the Dual_EC_DRBG algorithm, often referred to conversationally as the “Dual Elliptic Curve random number generator.” This algorithm has spawned controversy because of concerns that it might contain a weakness that attackers could exploit to predict the outcome of random number generation. NIST continues to recommend the other three algorithms that were included in the previous version of the Recommendation document, which was released in early 2012.
The revised version also contains several other notable changes. One concerns the CTR_DRBG—one of the three remaining random number algorithms—and allows additional options for its use. Another change recommends reintroducing randomness into deterministic algorithms as often as it is practical, because refreshing them provides additional protection against attack. The document also includes a link to examples that can help developers to implement the SP 800-90A random number generators correctly.
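The reseeding recommendation can be made concrete with a toy sketch: a deterministic generator that tracks how many requests it has served and mixes in fresh entropy once a reseed interval is reached. The code below is loosely in the spirit of SP 800-90A's hash-based DRBGs, but it is an illustration only, not a conformant or security-reviewed implementation, and the class and constant names are invented for this example.

```python
# Toy illustration of a deterministic random bit generator with periodic
# reseeding. NOT a conformant SP 800-90A DRBG; names are invented.
import hashlib
import os

class TinyHashDRBG:
    RESEED_INTERVAL = 10_000  # generate() calls between mandatory reseeds

    def __init__(self, entropy=None):
        self._reseed(entropy or os.urandom(32))

    def _reseed(self, entropy):
        # Mix fresh entropy into the internal state and reset the counter.
        self.state = hashlib.sha256(b"seed" + entropy).digest()
        self.counter = 0

    def generate(self, n_bytes):
        # Refresh the state with new entropy once the interval is exceeded,
        # mirroring the recommendation to reseed as often as practical.
        if self.counter >= self.RESEED_INTERVAL:
            self._reseed(os.urandom(32))
        out = b""
        while len(out) < n_bytes:
            # Advance the state by hashing it and emit the digest.
            self.state = hashlib.sha256(self.state).digest()
            out += self.state
        self.counter += 1
        return out[:n_bytes]

drbg = TinyHashDRBG()
print(len(drbg.generate(16)))  # → 16
```

The point of the counter is the design choice the revision highlights: even a well-seeded deterministic generator should not run indefinitely from one seed, because periodic reseeding limits how much output an attacker who compromises the state can predict.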
The revised publication reflects public comments received on a draft version, released late last year. Recommendation for Random Number Generation Using Deterministic Random Bit Generators (NIST Special Publication 800-90A Rev. 1) is available on NIST’s website: www.nist.gov/manuscript-publication-search.cfm?pub_id=918489.
Media Contact: Chad Boutin, email@example.com, (301) 975-4261
NIST Publishes Final Guidelines for Protecting Sensitive Government Information Held by Contractors
The National Institute of Standards and Technology (NIST) has published the final version of its guidance for federal agencies to ensure that sensitive federal information remains confidential when stored in nonfederal information systems and organizations.
Contractors routinely process, store and transmit sensitive federal information to assist federal agencies in carrying out their core missions and business operations. Federal information is also shared with state and local governments, universities and independent research organizations.
To keep this information secure, Executive Order 13556 established the Controlled Unclassified Information (CUI) Program to standardize the way the executive branch handles unclassified information that requires protection, such as personally identifiable information. The National Archives and Records Administration (NARA) administers the program. Information that qualifies as "controlled unclassified information" is defined by NARA in the CUI Registry, an extensive list of executive branch information that requires controls based on laws, regulations or government-wide policies.
To develop guidelines for protecting this information, NARA worked with NIST, the government's source for computer security standards and guidelines.
The two organizations jointly drafted guidelines for protecting CUI on information systems outside the immediate control of the federal government and published them for public comment last fall.
The new document, Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations (NIST Special Publication 800-171), is the final version of those guidelines.
The publication provides federal agencies with recommended requirements to protect the confidentiality of CUI residing in nonfederal systems and organizations consistent with law, regulation or government-wide policy.
The new guidelines are designed for federal employees with responsibilities for information systems development, acquisition, management and protection. The requirements apply to all components of nonfederal information systems and organizations that process, store or transmit CUI, or provide security protection for those components.
The guidelines are drawn from existing computer security requirements for federal information systems found in two of NIST's foundational information security documents: Federal Information Processing Standard (FIPS) 200 and the Security and Privacy Controls for Federal Information Systems and Organizations (NIST SP 800-53).
"NIST SP 800-171 is critical to our strategy to strengthen needed protections for CUI," says John Fitzpatrick, director of NARA's Information Security Oversight Office. "Together with NARA's recently-proposed CUI regulation and a planned Federal Acquisition Regulation clause, we will bring clarity and consistency to the handling of CUI across government."
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
NIST Taps Experts in Business Continuity, Network Disaster Recovery for Community Resilience Efforts
The National Institute of Standards and Technology (NIST) has named experts in business continuity planning and the post-disaster recovery of telecommunication networks to serve as NIST Disaster Resilience Fellows.
George B. Huff Jr., founder and director of The Continuity Project, Alexandria, Va., and Steve Poupos, AT&T’s director of technology operations, will assist NIST as it finalizes its Community Resilience Planning Guide for Buildings and Infrastructure. They also will contribute to follow-on efforts to support U.S. counties, cities and towns in implementing the guide.
Issued in April 2015 as a draft for public review, the planning guide lays out a flexible approach that communities can adapt and use to set priorities, allocate resources, and take actions that will help them to withstand and bounce back from the shocks and stresses of extreme weather and other hazards. NIST plans to issue the initial version in September 2015. The guide will be updated periodically.
Huff and Poupos join nine NIST Disaster Resilience Fellows named in October 2014. The two are recognized leaders in their fields and will complement the knowledge and skills of NIST researchers as they carry out a multifaceted program to assist communities and public and private stakeholders in cost-effectively strengthening local community resilience.
Poupos has extensive experience with communications networks, business continuity, disaster recovery and emergency management. He has led AT&T’s Network Disaster Recovery and Emergency Management Organization during recovery from Hurricanes Sandy and Irene and many other hazard events.
As the head of The Continuity Project, Huff advises businesses and other organizations on emergency preparedness and response, business continuity management, and information technology security. Previously, he served as attorney-advisor to the Office of Security and Facilities/Judiciary Emergency Preparedness Office of the Administrative Office of the U.S. Courts. Huff participates in developing business continuity standards in the International Organization for Standardization (ISO) and other bodies.
Media Contact: Mark Bello, email@example.com, 301-975-3776