
Second: The Past

Archaeologists believe that ancient peoples first measured time by tracking the phases of the moon. Starting around 20,000 years ago, they recorded these moon phases on cave walls as well as on stone, antler and bone. Scientists speculate that early humans’ interest in time likely served both practical and superstitious purposes.

Approximately 5,000 years ago, Egyptians noticed that the rising of the star Sirius usually coincided with the flooding of the Nile, a vital annual event upon which their civilization depended for food. They built the first sundials to mark the passage of the day, dividing day and night into 12 hours each, the “duodecimal” arrangement we still use today.

The Egyptians later built the first water clocks, which measured time by regulating the flow of water out of a tank. Markings on the tank represented the hours; as the tank emptied, the time could be read from the water level. Using the same design, the Chinese later built a more accurate and reliable clock that used liquid mercury instead of water to avoid freezing.


Ctesibius water clock, 3rd century BC, Alexandria (reconstruction) at the Thessaloniki Technology Museum
Credit: Creative Commons (CC BY-SA 4.0). User Gts-tg

The hourglass is a surprisingly recent invention; evidence suggests that it first appeared less than a thousand years ago. It operated according to the same basic principle as the water clock: It used gravity’s pull on a substance to mark the passage of time. The candle clock, another invention from the last thousand years, measured time based upon how quickly a “standard” candle of a given material and length would burn.   

The first truly mechanical clock that resembled today’s timekeeping devices was built in Europe in the 14th century. Early mechanical clocks were only accurate to about 15 minutes per day, about as accurate as a sundial could get when the sun was out. Mechanical clocks’ precision and accuracy increased steadily with the development of the spring-driven clock in the 15th century and the pendulum-driven clock in the 17th century.

As clocks improved, people started to divide time into smaller and smaller chunks. Around 1580, Jost Bürgi, a Swiss clockmaker, built a clock with a minute hand; early adopters of this design included astronomer Tycho Brahe. In 1680, British clockmaker Daniel Quare added a second hand to a pendulum clock.


Bracket clock by Daniel Quare on display at the Walker Art Gallery, Liverpool
Credit: Creative Commons (CC BY 3.0). User: Racklever

Around this time, the Royal Society of London proposed a standard for timekeeping. It was actually a standard of length: the length of a pendulum that took one second to complete a half-swing. But a problem quickly emerged: French scientists found that the proper length depended on the clock’s geographical location, because the local strength of gravity, which varies with latitude and altitude, affects the rate at which a pendulum swings.
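The gravity dependence follows directly from the small-angle period formula for a pendulum, a standard physics result not spelled out in the original:

```latex
T = 2\pi\sqrt{\frac{L}{g}}
\quad\Longrightarrow\quad
L = g\left(\frac{T}{2\pi}\right)^{2}
```

For a half-swing of one second, T = 2 s, giving L = g/π² ≈ 0.994 m at g = 9.81 m/s². Because g varies by roughly half a percent between the equator and the poles, the length of this “seconds pendulum” varies measurably from place to place, which is exactly the problem the French scientists encountered.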

The 18th century saw the invention of the marine chronometer, which was the life’s work of English clockmaker John Harrison.


John Harrison's H1 marine timepiece
Credit: Flickr User: Metadata Deluxe

For navigational purposes, the British Empire needed a clock durable enough to withstand travel at sea while remaining accurate enough to calculate longitude, the distance traveled east or west. To accomplish this, Parliament passed the Longitude Act, which awarded money to enterprising clockmakers as they worked to improve and perfect their devices. Harrison took the lion’s share of these rewards during the 31 years he worked on his chronometer, which kept accurate time to within 1/5 of a second per day. Other clockmakers in Britain and abroad later built even more accurate versions than Harrison’s.
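The link between clock error and longitude error is simple arithmetic: Earth rotates 15 degrees per hour, so an error in a ship’s clock translates directly into an east-west position error. A rough sketch of the conversion (the six-week voyage length and the equatorial figures below are illustrative assumptions, not from the original):

```python
# Longitude from timekeeping: the Earth turns 360 degrees in 24 hours,
# so local solar time differs from reference (e.g., Greenwich) time by
# 15 degrees of longitude per hour. A clock error therefore becomes a
# position error. (Illustrative sketch; figures are approximate.)

EARTH_CIRCUMFERENCE_KM = 40_075          # at the equator
KM_PER_DEGREE = EARTH_CIRCUMFERENCE_KM / 360
DEGREES_PER_SECOND_OF_TIME = 15 / 3600   # 15 degrees per hour

def position_error_km(clock_error_seconds):
    """East-west position error at the equator for a given clock error."""
    return clock_error_seconds * DEGREES_PER_SECOND_OF_TIME * KM_PER_DEGREE

# Harrison-level accuracy: 1/5 second per day, over a six-week voyage.
drift_seconds = 0.2 * 42
print(round(position_error_km(drift_seconds), 1))  # about 3.9 km
```

A few kilometers of uncertainty after six weeks at sea was a dramatic improvement over the tens or hundreds of kilometers that cruder clocks allowed.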

Chronometers had a very specialized function: to help ships navigate. But they also drove innovation in timekeeping. Harrison’s first chronometer was a weighty metal construction that would be hard to fit into a box 4 feet (1.2 meters) on a side. But Harrison’s last chronometers could fit inside a pocket. 

In the late 18th century, France, after its revolution, proposed the “decimal second,” which would be 1/1,000th of a decimal minute and a millionth of a day. This concept was abandoned, but as Claude Audoin and Bernard Guinot point out in their 1998 book “The Measurement of Time,” one concept that lives on to this day is the idea of splitting a single second into decimal amounts, such as the hundredth-of-a-second (0.01 s) increments in which Olympic swimming and track events are measured.

By the 19th century, timepieces of all types were becoming widespread. But the mechanical clocks of this era remained highly susceptible to errors from friction and changes in temperature. So famed physicist James Clerk Maxwell proposed a radical new idea, which was in turn endorsed by the eminent mathematician and physicist Lord Kelvin, the namesake of the official international unit of thermodynamic temperature. The two mused that the vibrations of atoms could serve as an invariant “natural standard.” Atoms, they explained, were identical to one another and would never change, so they would always “tick” with the same frequency and wouldn’t be susceptible to the same sorts of disturbances as mechanical clocks.

However, the technology needed to make a pendulum out of an atom wasn’t even close to being invented yet, so it would be some time before the mechanical clock was handed its hat.  

The National Bureau of Standards, now the National Institute of Standards and Technology (NIST), was established in 1901. At that time, the most advanced clock was the Riefler clock, whose pendulum was encased in a vacuum chamber to reduce friction.


The National Bureau of Standards (NBS) purchased this Riefler clock in 1904 from the firm Clemens Riefler in Germany. The Riefler clock was a highly stable pendulum clock and it served as the national time interval standard until 1929, when it was replaced by the Shortt clock. The Riefler clock's pendulum, made of Invar, hangs from a thin elastic metal strap and receives an impulse every second by a rocking of the clamp which supports the upper end of the strap. The weight that supplies the energy for this rocking and for the counting movement is "rewound" every 30 seconds by an electromagnet driven by dry cells.
Credit: NIST

Accurate to within 1/100th of a second per day (or about 3.5 seconds per year), the Riefler clock was NIST’s primary time standard from 1904 until 1929, when it was replaced by the even more advanced pendulum-driven Shortt clock. The Shortt clock used two pendulums, a master and a slave. The slave pendulum drove the clock’s mechanics and was linked electrically to the master pendulum, freeing the master as much as possible from friction and disturbance. The Shortt clock was accurate to within about 1 second per year. Though remarkably accurate for a mechanical clock, it was still subject to environmental disturbances such as vibration and had to be closely watched to ensure that it was operating optimally.


The Shortt clock was purchased by the National Bureau of Standards (NBS) in 1929 from the Synchronome Co. of London. Due to its precision of within about 1 millisecond per day, it replaced the Riefler Clock (1998.0515.001) at NBS, as well as in astronomical observatories. The Shortt clock was replaced in its turn within a few years by quartz crystal oscillators (similar to 1998.0373.001). The Shortt clock remained at NBS, where it served as a reference for the determination of G, the gravitational constant.
Credit: NIST

The Shortt clock was replaced a short time later with a clock based on the quartz crystal oscillator, invented in 1927. Quartz crystal oscillators had already been built into standard electrical circuits in the early 1920s to calibrate the frequency of radio transmissions and keep stations from interfering with one another. Quartz oscillators and quartz clocks take advantage of the fact that quartz is a piezoelectric material: The crystal flexes when a voltage is applied to it and, conversely, generates a small voltage when it is flexed. It does this at a very steady rate. In most quartz wristwatches, the piece of quartz vibrates 32,768 times a second, and the watch is accurate to about 15 seconds per month. Pretty good for something people wear on their wrists (though the phone in your pocket gets its time from a much more accurate source)!
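The number 32,768 is not arbitrary: it is 2 to the 15th power, so a chain of 15 divide-by-two stages turns the crystal’s vibration into exactly one pulse per second. A minimal sketch of that counting logic (illustrative; a real watch does this with flip-flop hardware, not software):

```python
# A quartz watch halves the 32,768 Hz crystal signal repeatedly;
# because 32,768 = 2**15, fifteen divide-by-two stages yield 1 Hz,
# the once-per-second tick that drives the display.

freq_hz = 32_768
stages = 0
while freq_hz > 1:
    freq_hz //= 2   # one binary divider (flip-flop) stage
    stages += 1

print(stages, freq_hz)  # 15 1
```

Any power of two would work the same way; 32,768 Hz is high enough for a physically small crystal yet low enough for low-power counting circuitry.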


The demand for time standards, accurate to a thousandth of a second or better for navigation, communication, seismology, geological surveys, and physical science, led to the development of the quartz crystal clock in 1929. As the first truly modern electronic clock, it substituted the natural resonance frequency of carefully cut bits of quartz for the swinging pendulums of earlier mechanical devices. Shown here is the heart of an electronic crystal clock, a carefully selected quartz crystal sealed in an evacuated container. The operation of a quartz clock is based on the piezoelectric property of quartz crystals. When electrically excited, the quartz crystal uses its piezoelectric properties to act as a resonator. The addition of an amplifier and feedback to the resonator produces a quartz crystal oscillator, or frequency standard. Somewhat simpler and much smaller quartz crystals serve in today’s quartz clocks and watches.
Credit: NIST

NIST’s quartz crystal time standards, however, were much better made than a quartz wristwatch. Properly insulated, they were unaffected by gravity, noise and outside vibration, making them much more robust than a pendulum-driven clock. They were accurate to about 3 seconds per year, a little worse than a Shortt clock, but their reliability and reduced need for maintenance made them appealing for use as a national standard. The quartz clock remained the U.S. time interval standard until the 1960s.

In 1960, the second was still understood as it had been for centuries: It was a fraction of an Earth day, the time that it takes our planet to make a complete rotation. Put another way, a second was 1/86,400th of an Earth day (24 hours × 60 minutes × 60 seconds = 86,400 seconds).

Yet scientists increasingly realized that this definition was inaccurate: slight variations in the Earth’s rotational speed from day to day made this an unsatisfactory standard. So in 1960, the General Conference on Weights and Measures (CGPM) approved a new definition of the second based on the yearly orbit of the Earth around the Sun. This new definition, depending upon a celestial orbit (also known as an astronomical ephemeris), was termed the “ephemeris second.” Much more accurate than the previous definition, it was also extremely unwieldy: The second would now be defined as 1/31,556,925.9747 of a tropical year (the time between two summer solstices) for 1900. Just a few years later, the CGPM would declare the ephemeris second to be “inadequate for the present needs of metrology.”
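As a sanity check, that unwieldy fraction does recover the familiar length of the year (a quick calculation, not part of the original):

```python
# The ephemeris second defined the tropical year for 1900 as
# 31,556,925.9747 seconds. Dividing by the 86,400 seconds in a day
# recovers the familiar ~365.24-day year.

tropical_year_s = 31_556_925.9747
seconds_per_day = 24 * 60 * 60          # 86,400
days = tropical_year_s / seconds_per_day
print(round(days, 4))  # 365.2422
```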

But around this time, a new clock burst on the scene and changed timekeeping forever. It finally promised to realize Maxwell’s vision of a clock based on an unvarying natural pendulum.

This new clock began to take shape back around Christmas of 1949 in the laboratory of NIST’s Harold Lyons. Lyons’ group bombarded a cloud of ammonia molecules trapped in a 30-foot-long cell with microwaves. As they scanned across the microwave frequencies, they hit upon one that caused the molecules to give off a maximum amount of their own microwaves. The team used a device that counted the number of microwave peaks per second (the frequency) and manually transferred that frequency to a quartz crystal oscillator. The process is akin to tuning a guitar: You take the tone you are given and tune the string to match it. With this method, the team was able to define the second as the time it took the ammonia to emit 23.870 billion cycles of radiation.


NIST Director Edward Condon (left) and clock inventor Harold Lyons contemplate the ammonia molecule upon which the clock was based.

Lyons’ 1949 clock was crude by modern standards: Though ammonia can broadcast a signal to mark a second precisely, the technology for measuring and transferring that signal was still in the early stages. In the end, the ammonia clock was no more accurate or precise than keeping time by measuring the rotation of the Earth. Still, it proved the concept of an atomic clock. 

In 1955, Louis Essen at the National Physical Laboratory in the U.K. built upon Nobel Prize-winning physicist Isidor Rabi’s idea to use a beam of cesium atoms (as opposed to a diffuse cloud of ammonia molecules) and constructed the first atomic clock that was stable enough to be used as a time interval standard. 

NIST started to build a cesium atomic clock in 1952 but did not finish it until 1958. Called NBS-1, it became the U.S. national standard of frequency in 1959.


Atomic clock NBS-1
Credit: NIST

What is a “standard of frequency” and how is it related to timekeeping? The first thing to note is that atoms don’t tell time. Instead, they absorb and release radiation such as microwaves with a well-defined frequency, the number of wave cycles per second. In an atomic clock, a particular frequency of microwaves from cesium—9,192,631,770 cycles of radiation per second—is converted into a time interval. In other words, scientists define a second as the time it takes to count 9,192,631,770 cycles of this radiation.
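Concretely, the conversion from frequency to time interval is just counting. The sketch below is schematic (real standards divide the frequency with electronics rather than a literal software counter):

```python
# In a cesium clock, one second elapses when 9,192,631,770 cycles of
# the hyperfine transition radiation have been counted. (Schematic
# illustration; real standards use frequency-division electronics.)

CESIUM_HZ = 9_192_631_770

def elapsed_seconds(cycles_counted):
    """Convert a raw cycle count into elapsed SI seconds."""
    return cycles_counted / CESIUM_HZ

print(elapsed_seconds(9_192_631_770))    # 1.0
print(elapsed_seconds(27_577_895_310))   # 3.0
```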

And despite what headlines seem to suggest, atomic clocks don’t run forever once they are turned on. They run very accurately for limited periods of time, before they need to be calibrated or adjusted. Nowadays, hundreds of atomic clocks in dozens of places around the world operate simultaneously to provide signals used for timekeeping. But to get a sense of how accurate they are, it’s often helpful to explain how long it would take an atomic clock to gain or lose a second if it ran continuously.
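That “how long to gain or lose a second” figure is a direct conversion from a clock’s fractional frequency accuracy. Here is the back-of-the-envelope formula (the specific accuracy value used below is purely illustrative):

```python
# A clock whose frequency is off by a fraction f accumulates f seconds
# of error per second of running, so it takes 1/f seconds of continuous
# operation to drift by one full second.

SECONDS_PER_YEAR = 365.25 * 24 * 3600    # about 3.16e7

def years_to_drift_one_second(fractional_accuracy):
    """Years of continuous running before the clock is off by 1 second."""
    return 1 / (fractional_accuracy * SECONDS_PER_YEAR)

# e.g., a fractional accuracy of 1e-11 (illustrative value):
print(round(years_to_drift_one_second(1e-11)))  # roughly 3,000 years
```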

The NBS-1 had an accuracy of 15 parts per trillion, meaning the clock wouldn’t gain or lose a second if it ran continuously for roughly 2,000 years. Put another way, it gained or lost no more than about 1/2,000th of a second every year. The NBS-1 was followed by more accurate clocks, NBS-2 and NBS-3, which became the U.S. national standard of frequency by the end of the 1960s.

With the advent of these super-accurate atomic clocks, the definition of the second was ready to change. In 1967, the International System of Units (SI) redefined the second as the time it took the cesium-133 atom to release 9,192,631,770 cycles of microwave radiation when making its “hyperfine energy transition.” 

We won’t go into the technical specifics, but let’s break down this phrase a bit. The “energy transition” refers to the little up-and-down jumps in energy made by the outermost electron orbiting the cesium atom’s core, or nucleus. “Hyperfine” reflects the slight alterations in the atom’s overall energy levels when the electron interacts magnetically with the nucleus. When the cesium atom absorbs just the right amount of microwave energy, it makes the hyperfine energy transition: The atom enters a higher energy level and the electron flips its “spin,” which can be imagined as a bar magnet piercing through the atom that flips upside down. When the atom jumps back down to its lower-energy state, it releases microwaves with the precise frequency of 9,192,631,770 cycles per second, which is used as a timekeeping standard.
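For a sense of scale, that frequency corresponds to a wavelength of about 3 centimeters, squarely in the microwave band (a quick derived calculation, not stated in the original):

```python
# Wavelength of the cesium clock transition: lambda = c / f.

C_M_PER_S = 299_792_458          # speed of light in m/s (exact, by definition)
CESIUM_HZ = 9_192_631_770        # cesium-133 hyperfine frequency

wavelength_cm = C_M_PER_S / CESIUM_HZ * 100
print(round(wavelength_cm, 2))   # 3.26 cm: microwave radiation
```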

For almost 50 years, atomic clocks were “hot” clocks. That is, they read the microwave transition frequencies of cesium atoms at room temperature. Room temperature may seem merely “warm” to us, but it’s plenty hot for a gas of atoms. At this temperature, cesium atoms move several hundred meters per second. This limits the clock’s accuracy, as the atoms become a little too “blurry” to measure. More precisely, the cesium atoms’ motion slightly increases or reduces the frequency of microwave radiation that each atom needs in order to make its energy transition. Even though these first clocks were “hot,” their accuracy and precision steadily improved as scientists sought to root out and minimize sources of disruption and error such as temperature variations within the clock and unwanted hyperfine energy transitions. 
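How fast is “several hundred meters per second,” and how much blur does it cause? A rough kinetic-theory estimate (the room-temperature value and the first-order Doppler treatment are simplifying assumptions for illustration; beam-clock geometry suppresses much of this shift in practice):

```python
import math

# Rms thermal speed of cesium atoms, v = sqrt(3 k T / m), and the
# first-order Doppler shift that speed would impose on the clock
# transition frequency. (Order-of-magnitude illustration only.)

K_BOLTZMANN = 1.380649e-23       # J/K
T_ROOM = 300                     # K, assumed room temperature
M_CS = 133 * 1.66054e-27         # kg, approximate cesium-133 mass
C = 299_792_458                  # m/s
F_CS = 9_192_631_770             # Hz

v_rms = math.sqrt(3 * K_BOLTZMANN * T_ROOM / M_CS)
shift_hz = F_CS * v_rms / C      # first-order Doppler shift

print(round(v_rms))              # a couple hundred m/s
print(round(shift_hz))           # several kHz of potential shift
```

A shift of kilohertz against a 9.19 GHz reference is a fractional error near one part in a million, which is why taming atomic motion became central to improving clock accuracy.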

By 1975, NIST’s NBS-6 atomic clock was accurate and stable enough so that it would neither gain nor lose a second in 400,000 years.


Physicist David Glaze with atomic clock NBS-6
Credit: NIST

Launched nearly two decades later, in 1993, NIST’s NIST-7 atomic clock was significantly more accurate and wouldn’t gain or lose a second in 6 million years. Put in other terms, the clock would neither gain nor lose more than 1/6,000,000th of a second per year, or roughly half a nanosecond (half a billionth of a second) per day.


Atomic frequency standard NIST-7 and its creators (left to right) John P. Lowe, project leader Robert E. Drullinger, and David J. Glaze
Credit: NIST

You don’t need this kind of precision for your watch. But being able to measure such tiny fractions of a second opened up all sorts of technological applications that would have been impossible just decades earlier. 

Created April 9, 2019, Updated April 26, 2021