
Redefining the SI Base Units

Note:

Much more recent information is available about redefinition of the SI units. For a comprehensive general overview, see How to Weigh Everything from Atoms to Apples Using the Revised SI (PDF).

For a detailed explanation of the redefinition of the kilogram, see Redefining the Kilogram.

For a unit-by-unit description of how the redefinition will affect NIST calibrations, see The Redefinition of the SI: Impact on Calibration Services at NIST (PDF).

Metrology is poised to undergo a profound change that will benefit scientists, engineers, industry and commerce – but which almost no one will notice in daily life.

The international General Conference on Weights and Measures (CGPM) has approved a plan to redefine four of the seven base units of the International System of Units (SI) in terms of fixed values of natural constants. The initiative would make possible new worldwide levels of consistency and accuracy, simplify and normalize the unit definitions, and liberate the system from dependence on the prototype kilogram, an artifact adopted in 1889 and still used as the world's physical standard for mass.

On Oct. 21, 2011, CGPM, the diplomatic body that has the authority under the Meter Convention to enact such a sweeping change, passed a resolution declaring that the kilogram, the ampere, the kelvin and the mole, "will be redefined in terms of invariants of nature; the new definitions will be based on fixed numerical values of the Planck constant (h), the elementary charge (e), the Boltzmann constant (k), and the Avogadro constant (NA), respectively."

That action follows – and results directly from – decades of pioneering metrology research around the globe, some of it accomplished by various groups at NIST and its antecedent, the National Bureau of Standards (NBS), that are now part of PML. And it echoes the recommendations made by three PML scientists and two European colleagues in an influential 2006 paper in Metrologia.

The change¹ will not be implemented until the technical requirements for agreement among experiments and for measurement uncertainty are met. (The next scheduled meeting of the CGPM is in 2014.) In the interim, more work will be required: CGPM has called for further reductions in measurement uncertainty before the "New SI" can be implemented, and encouraged national metrology institutes (NMIs) and other institutions to "maintain their efforts towards the experimental determination of the fundamental constants h, e, k and NA."

International Prototype Kilogram
The International Prototype Kilogram is stored in an arrangement of three nested bell jars.
Credit: Photo courtesy of BIPM
"Continual improvement in the means for making precision measurements, namely the SI system, is the core of NIST's core mission," says Patrick Gallagher, Under Secretary of Commerce for Standards and Technology and NIST Director. "PML's fundamental metrology research helps ensure that the SI system changes with the times. This helps NIST anticipate U.S. industry's needs and ultimately helps create jobs by ensuring the nation can make the highest quality, most in-demand products."

The central philosophy of the impending redefinition is that instead of defining an SI unit per se, the CGPM will specify exact values for a set of fundamental constants, which will set the scale for the SI units. The values of those physical constants will reflect the most accurate determinations available from NIST, other NMIs, and academic institutions at the time of implementation.

Three decades ago, the same scheme was used for the first time to redefine the meter: In 1983, CGPM defined the meter by setting an exact fixed value of the speed of light in vacuum (299 792 458 m s⁻¹), citing "the excellent agreement among the results of wavelength measurements on the radiations of lasers . . . ."
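In operational terms, fixing the constant inverts the familiar relation d = c·t: with c set at exactly 299 792 458 m/s, one meter is the distance light travels in vacuum in 1/299 792 458 of a second, so realizing the meter reduces to measuring a time interval, something atomic clocks do extraordinarily well.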

"That is kind of a model for the way they're planning to redefine the other units now," says Peter Mohr of PML's Quantum Measurement Division, one of the authors of the 2006 Metrologia paper.

The meter redefinition relied heavily on a then-new method, devised by NBS researchers in Boulder, to measure the speed of light, and since then PML has been substantially involved in all aspects of the "New SI." In particular, PML groups have made major, sustained contributions to the impending redefinitions of the kilogram and the kelvin, the implementation of which will take place under the auspices of the International Bureau of Weights and Measures (BIPM), an intergovernmental organization under the authority of the CGPM.

The Kilogram

The kilogram – the only SI unit whose current definition is based on a manufactured object – is literally embodied in the original platinum-iridium prototype, the mass of which appears to have drifted by a few parts in 10⁸ over the past 100 years. Much of that evidence comes from a device designed and made by a PML predecessor: Beginning in 1970, the BIPM conducted its mass comparisons using a balance called NBS-2.

Richard Steiner adjusting the electronic kilogram
Physicist Richard Steiner adjusts the electronic kilogram, an experimental apparatus for defining mass in terms of the basic properties of nature.
Credit: Copyright Robert Rathe
In the new system, the kg will be defined by fixing the value of the Planck constant, h, at 6.626 06X × 10⁻³⁴ joule second. (The final "X" represents one or more digits to be determined by results of future experiments.) That definition, succinctly proposed by Mohr and fellow PML scientist Barry Taylor in 1999, links h to mass based on two fundamental relationships: E = hν and E = mc².
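Put together, those two relationships give the conceptual chain from the Planck constant to mass (a sketch of the idea, not the laboratory procedure):

    E = hν  and  E = mc²,  so  m = hν / c²

With h fixed, a body's mass corresponds to an equivalent frequency through this relation, and frequencies are among the most precisely measurable quantities in physics. In practice the link is made electrically, through the watt balance described below, rather than by literally weighing photons.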

"It didn't just come out of the blue. It was germinating for quite a while because the idea of using a watt balance to monitor the kilogram had been around for a while," says Taylor. "Peter and I were returning from a talk on the watt balance in 1999 when the idea of using it to determine mass from the Planck constant rather than vice versa came up. That led to the paper."

Some experts have set a target relative uncertainty of 2 × 10⁻⁸ (two parts in a hundred million) for the kilogram. "That's a very high bar," says Mohr. But two approaches that are physically very different are getting close to that threshold.

One employs a device called a watt balance, which measures the force required to balance a 1 kg mass artifact against the pull of Earth's gravity by monitoring the voltage and current (hence the name "watt") involved in doing so. The first such device, pioneered at the United Kingdom's National Physical Laboratory (NPL), "was originally involved in the realization of the ampere," Taylor says. A more recent version of the balance was transferred last year to Canada's National Research Council (NRC).
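The arithmetic behind the name is simple enough to sketch. In weighing mode, the electromagnetic force on a current-carrying coil balances the weight mg; in moving mode, the same coil is driven at a known velocity and the induced voltage is measured. Equating the electrical and mechanical power, U·I = m·g·v, gives the mass. The numbers below are invented round values, not NIST measurements; in the real instrument the voltage and current are themselves tied to h through the Josephson and quantum Hall effects.

    # Watt-balance principle, with made-up illustrative values (not NIST data)
    g = 9.80      # local gravitational acceleration, m/s^2 (assumed)
    v = 2.0e-3    # coil velocity in moving mode, m/s (assumed)
    U = 1.0       # voltage induced across the coil in moving mode, V (assumed)
    I = 0.0196    # current that balances the mass in weighing mode, A (assumed)

    # Equating electrical and mechanical power: U * I = m * g * v
    m = U * I / (g * v)
    print(f"inferred mass: {m:.4f} kg")   # ~1.0 kg with these numbers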

NIST PML has operated a vacuum-enclosed watt balance for years. In 2007, the PML team reported a relative uncertainty of 3.6 × 10⁻⁸. Other watt balance devices are either operating or under development in Canada, Switzerland, France, China, and at the BIPM.

The other approach to determining h, pursued by a single large international collaboration, involves "counting" the number of atoms (via unit cell volume of a crystal) in each of two highly pure 1 kg single-crystal enriched silicon spheres about 94 mm in diameter. The result provides a determination of the Avogadro constant, which in turn can be used to obtain h using the well-known values of other constants. 
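The counting is done crystallographically rather than one atom at a time. Silicon's cubic unit cell contains 8 atoms, so measuring the lattice parameter, density, and molar mass of the crystal yields the Avogadro constant; h then follows because the product of the Avogadro and Planck constants (the molar Planck constant) is already known very accurately from other measurements. A rough sketch with rounded, textbook-style values (not the collaboration's data):

    # Silicon "atom counting" route to the Avogadro constant (rounded illustrative values)
    atoms_per_cell = 8          # atoms in silicon's cubic unit cell
    a = 5.431e-10               # lattice parameter, m (approximate natural Si)
    M = 28.09e-3                # molar mass, kg/mol (approximate natural Si)
    rho = 2329.0                # density, kg/m^3 (approximate)

    # atoms per unit volume times molar volume = atoms per mole
    N_A = atoms_per_cell * M / (rho * a**3)
    print(f"N_A ~ {N_A:.3e} /mol")   # ~6.02e23 with these inputs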

PML is not actively involved in Avogadro experiments at the present time², although some of the best early work on Si crystals was done about 40 years ago by NBS researchers led by Richard Deslattes. Measurements of crystals "go back a really long time," Taylor says, "but Deslattes was the first person to plan out a truly full-scale attack on the Avogadro constant using silicon spheres, a combined X-ray and optical interferometer, and research on isotopic abundances," among other things.

Diagram of the watt balance.
Diagram of NIST's existing watt balance.
Taylor, Mohr and NIST watt-balance veteran Richard Steiner believe that, eventually, the watt balance is likely to become the standard device: "It seems that the watt balance is the more likely candidate," Mohr says, "because there are six independent experiments, whereas there's only one silicon experiment, albeit with two spheres."

Steiner, who worked for years on the NIST watt balance and was the first author on the 2007 paper, says "The CGPM resolution leaves the options open for defining mass from the Planck constant or the Avogadro constant. Whether the watt balance or Si spheres are used as the mass realization system will come down to practicality. Presumably, with the practical goal for the watt balances of measuring masses other than a few expensive platinum-iridium cylinders, it should have a greater potential usefulness as opposed to measuring even fewer and even more expensive silicon spheres.

"Both research areas need to be continued, but they are not equally practical in the long term for calibration dissemination. Even the watt balances have short-term (a few months) uncertainty limits that have not been fully characterized. Most published values rely upon long-term data of years, where many uncertainty contributions are considered to average out."

PML is actively undertaking two watt-balance initiatives: a top-to-bottom investigation of the existing NIST device in search of any sources of error that can be eliminated to improve its uncertainty figures, and construction of an entirely new watt balance in a dedicated, climate-controlled facility.

"The one that's out there now you can think of as a thoroughbred," says Jon Pratt, who heads the Fundamental Electrical Measurements group in the Quantum Measurements Division. "It's been bred for speed, it wants to get a value of h, and it wants to do it once. The next one we're building is much more of a Clydesdale. We want it to be much more of a workhorse, able to measure masses on a consistent basis." It will employ a 1 metric ton permanent magnet, in contrast to the superconducting magnet in the existing unit, and will take several years to complete.

Meanwhile, Pratt is heading the re-examination of the existing balance – a timely endeavor. "The NRC has a new number using the watt balance they purchased from NPL that is very close to the latest value from the international Avogadro project," Pratt says. "That certainly doesn't mean that they're both definitely right. But it's a good opportunity for us to take a new look at the NIST apparatus. There's a difference of about 250 parts per billion between the NRC balance and ours that we want to work with the NRC to understand. They appear to have made significant strides in tightening up the uncertainty associated with operating that particular balance, and if both our estimated uncertainties are to be believed, then differences exist between our balances that are yet to be properly accounted for.  

Mike Moldover with the spherical acoustic chamber.
PML scientist Mike Moldover in 1988 with the spherical acoustic chamber.
"Both the NRC and NIST systems have a good 20 years of people scratching their heads about what kinds of systematic mistakes they could be making. The NIST figures are quite consistent, and we've been beating the error bars down over time. Nonetheless, we're doing a top-to-bottom analysis of the experiment, to see if something new crops up. If there are problems, it's most likely that we will see a compounding of multiple small things. And the same is true of the NRC balance."

The next run on the PML balance is scheduled for February 2012, with more to follow. But even if the analysis uncovers a number of ways to reduce relative uncertainty, the chances of getting to the CGPM's preferred figure of 2 × 10⁻⁸ are "slim," Pratt says. "There's a limit to what's achievable. We might be able to coax some more out of it, but I doubt it." On the other hand, CGPM may decide that its current target is too stringent. "Our hope," Taylor says, "is that when push comes to shove, three or four in 10⁸ will be acceptable."


The Kelvin

The SI base unit for thermodynamic temperature is the kelvin. It was first defined in modern form in 1954, and restated in 1967 as "the fraction 1/273.16 of the thermodynamic temperature of the triple point of water." That would seem to make it fairly easy to realize. But in fact, that temperature depends critically on the relative proportions of isotopes of hydrogen and oxygen in the water (among other factors). So in 2005, the definition was amended to include a recipe:

"This definition refers to water having the isotopic composition defined exactly by the following amount of substance ratios: 0.000 155 76 mole of 2H per mole of 1H, 0.000 379 9 mole of 17O per mole of 16O, and 0.002 005 2 mole of 18O per mole of 16O." That formula, known somewhat confusingly as Vienna Standard Mean Ocean Water even though it contains no salt or other solutes or impurities, is not easy to reproduce in the laboratory.

But even with the right water, the definition will only yield the temperature at the triple point – a temperature at which most of the sophisticated "primary" thermometers employed by NMIs cannot be used. Those devices rely on the more difficult measurement of fundamental thermodynamic quantities, and particularly the average energy in a mechanical degree of freedom. The Boltzmann constant links that energy to temperature.
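That link is the equipartition relation of statistical mechanics: in thermal equilibrium, each mechanical degree of freedom carries an average energy of ½·k·T. Measuring that average energy at a known temperature therefore determines k; once k is fixed by definition, the same kind of measurement yields the thermodynamic temperature directly.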

In the "New SI," the kelvin will be defined specifically in terms of the Boltzmann constant, k (sometimes written as kB), which is known to a very high degree of accuracy – thanks in large measure to PML scientist Mike Moldover and colleagues, who set the standard for precision measurement over 20 years ago using an acoustic method.

Acoustic chamber and diagram of its construction.
The acoustic chamber and a diagram of its construction.
"The faster the particles go, the faster sound moves in the gas," Moldover explains. "So in the 1980s, our group developed an approach to measuring the speed of sound which was more accurate than anyone had done before. We measured the speed of sound in argon gas near the temperature of the triple point of water. By combining the speed of sound with the average mass of the argon atoms, we deduced the average kinetic energy of the argon atoms. The combination of the kinetic energy and the temperature of the argon gave us the Boltzmann constant."

Moldover's original motivation was to check a British result. "The folks at one of our rival labs, NPL, determined k from an acoustic measurement of the gas constant in 1976," he recalls. "Their number was surprisingly large – well off from what people thought it was. 

"My boss said, 'I'm sure this is wrong. You guys think of a way to do a better job of measuring it.' So I set about using acoustic resonators. As it happened, very soon after the NPL results were announced, two Oxford scientists pointed out that the NPL people had made an error in extrapolating to zero pressure. The NPL researchers corrected their error, and their number came back to what people had expected. But we kept going anyway, because we thought we could do five times better. And we did."

It ended up taking 12 years. When the group's figures were published in 1988, the paper filled 60 pages in the Journal of Research of the National Bureau of Standards, and specified k to a relative uncertainty of 1.8 × 10⁻⁶.

"His result has stood the test of time," says Taylor. "It was not until last year when somebody obtained a value with an uncertainty they claim is smaller than his. Mike's work was a real tour de force."

But it required overcoming a host of problems. The spherical acoustic chamber held about three liters of gas, the isotopic composition of which had to be known exactly. To determine the volume, the team weighed the mercury required to fill it, using the density of mercury measured by A.H. Cook in 1957 and 1961. The temperature had to be determined to within 0.25 mK. "And we had to learn a lot of little things that affect the results in small ways," Moldover says. "The way you measure a resonant frequency is very tricky. Some heat flows from the gas into the wall, so you have to account for that. Also, the walls recoil a little bit, and that changes the resonant frequency, so you have to know precisely how elastic the metal is."

The effort continued past 1988, as Moldover applied his method to the International Temperature Scale published in 1990 (known as ITS-90), which lists key temperature points for various substances. "We measured the thermodynamic temperature for these different points and got values for the difference between ITS-90 and the thermodynamic temperature that were about five times more accurate than anyone had gotten before. They have since been reproduced elsewhere."

Today, PML remains intensely involved in a range of experiments on k. "NIST is one of the NMIs that have determined the Boltzmann constant using two different methods – acoustic gas thermometry and Johnson noise thermometry," says Greg Strouse, who leads PML's Temperature and Humidity Group.

"But unique to NIST, PML's Sensor Science Division (SSD) intends to determine the Boltzmann constant with a third method – spectral-radiance thermometry – to further our confidence in the assigned value and ensure that the impact of the re-definition on industry is small," Strouse says. "The NIST dissemination of both the thermodynamic temperature scale and ITS-90 expands the role of NIST, enhances the link between contact and non-contact thermometry, and allows for improved temperature measurement at both high and low temperatures. 

"While the dual dissemination of temperature will not impact most industries, the dissemination of thermodynamic temperature will improve the assigning of values to material thermo-physical properties. By advancing detector-based, Johnson noise, and acoustic thermometer technology, PML will improve the uncertainty in the realization and dissemination of the thermodynamic temperature scale. The use of thermodynamic thermometers will find application in high-temperature environments (e.g., metal refractory), hazardous environments (e.g., nuclear), and space monitoring of Earth's climate. Eventually, the use of the ITS-90 defining standards will be gradually replaced by thermodynamic temperature defining standards. NIST PML SSD is positioning itself for this change by developing thermodynamic thermometers for standards and industrial use."

Notes:

¹ In February 2012, NIST/PML and NRC Canada will sponsor a special session on SI redefinition at the annual meeting of the American Association for the Advancement of Science.

² Other NIST divisions have related research programs.

Released November 2, 2011, Updated August 6, 2019