Frequently asked questions (FAQ)
How clocks work, why cesium clocks are used, why we need precise time and frequency, stopwatch calibrations, calendars, and history.
How do clocks work?
Clocks work by counting a periodic event with a known frequency. For example, the pendulum in a grandfather clock swings back and forth at a steady rate, over and over, and the gears "count" the swings. The length of the pendulum arm is adjusted so that each half-swing takes one second. One cycle per second equals 1 hertz (Hz).
With a quartz clock (like most wristwatches), a piece of quartz crystal is cut and used in an electronic circuit where it vibrates at a certain frequency (usually 32,768 Hz). The circuit "counts" the vibrations, dividing the frequency by 32,768 to produce one-second ticks that advance the seconds on the clock. Dividing a high frequency down to a low one improves accuracy, because a given frequency error is a smaller fraction of a high frequency than of a low one. For instance, if a half-swing of a pendulum is off by 0.1 Hz, the grandfather clock will be off by one second in ten seconds. If the quartz frequency on a watch is off by 0.1 Hz, the watch will be off by one second in 327,680 seconds or, roughly, 0.26 seconds per day.
With an atomic clock, atoms have a natural tendency to change energy levels when they are exposed to very specific ("resonant") frequencies. If the correct frequency can be generated to make the atoms change levels, then that frequency can be counted or divided down and compared. In this case, the generated frequency is the 'tick'. The benefits of an atomic clock are that the resonant frequencies are natural properties of the atoms (not human-made) and that they are very high, in the billions of hertz. If an atomic clock were off by 1 Hz and the frequency were 1 GHz (1 billion Hz), then it would be off by one second in 31.7 years or, roughly, 86 microseconds (0.000086 s) per day.
The best cesium oscillators (such as NIST-F1) can produce frequency with an uncertainty of about 3 × 10⁻¹⁶, which translates to a time error of about 0.03 nanoseconds per day, or about one second in 100 million years.
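The arithmetic behind all three examples above is the same: the accumulated time error is the fractional frequency error multiplied by the elapsed time. Here is a minimal Python sketch (an illustration, not part of the original article; the function name is ours):

```python
SECONDS_PER_DAY = 86400

def time_error_per_day(freq_offset_hz, nominal_freq_hz):
    """Seconds of clock error accumulated per day for a given
    absolute frequency offset on a given nominal frequency."""
    return SECONDS_PER_DAY * freq_offset_hz / nominal_freq_hz

# Pendulum: 1 Hz nominal, 0.1 Hz off -> 8640 s/day (1 s of error every 10 s)
print(time_error_per_day(0.1, 1))       # 8640.0
# Quartz watch: 32,768 Hz nominal, 0.1 Hz off -> ~0.26 s/day
print(time_error_per_day(0.1, 32768))   # 0.2636...
# Atomic clock: 1 GHz nominal, 1 Hz off -> 86.4 microseconds/day
print(time_error_per_day(1, 1e9))       # 8.64e-05
# Cesium fountain: fractional uncertainty of 3e-16 -> ~0.026 ns/day
print(SECONDS_PER_DAY * 3e-16)          # 2.592e-11
```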
Why are Cesium atomic clocks used?
Since 1967, the International System of Units (SI) has defined the second as the duration of 9,192,631,770 cycles of the radiation corresponding to the transition between two hyperfine energy levels of the ground state of the cesium-133 atom. This definition makes the cesium oscillator (sometimes referred to generically as an atomic clock) the primary standard for time and frequency measurements. Other physical quantities, like the volt and the meter, also rely on the definition of the second as part of their own definitions.
Why must time and frequency be measured so precisely?
Precise time and frequency synchronization has many uses in everyday life. Synchronization between two or more locations is necessary for high-speed communication systems, banking and stock transactions, and transmitting everything from e-mail to sonar signals in a submarine. Power companies use precise time to regulate power grids and reduce losses. Radio and television stations require both precise time-of-day and precise frequency in order to broadcast programs. Mobile phone base stations must have stable and accurate oscillators in order to handle the massive amounts of data being transmitted and received.
Precise time measurements are also essential for accurate navigation and the support of communications on Earth and in space. Scientific organizations such as NASA depend on reliable and consistent time measurement for projects such as interplanetary space travel and transmissions. Small timing discrepancies between a space probe and ground-based tracking stations can dramatically affect the control and positioning of spacecraft. Precise time measurements are also essential to radio navigation systems like the Global Positioning System (GPS). The atomic clocks onboard the GPS satellites are synchronized to within nanoseconds of each other. Because a radio signal travels about 30 centimeters in one nanosecond, this synchronization makes it possible for a GPS receiver to calculate its position to within a few meters.
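To see why nanosecond synchronization matters for positioning, note that a ranging error is simply the timing error multiplied by the speed of light. A small illustrative sketch (not from the original article):

```python
C = 299_792_458  # speed of light in m/s

# A radio signal covers roughly 30 cm per nanosecond, so each nanosecond
# of clock error contributes about that much ranging error.
for ns in (1, 10, 100):
    error_m = C * ns * 1e-9
    print(f"{ns:>3} ns timing error -> {error_m:6.2f} m ranging error")
```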
For a review of time and frequency measurement requirements, see the article: Legal and Technical Measurement Requirements for Time and Frequency.
How are stopwatches and timers calibrated?
Stopwatches and timers can be calibrated by making a traceable comparison to an official audio time source like the NIST radio or telephone broadcast (see the section How can I hear NIST time?). There are several valid methods of calibrating stopwatches and timers, including the use of specialized equipment.
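As a rough illustration of the direct-comparison idea (the function name and the numbers below are hypothetical; the full procedure, including uncertainty analysis, is in the guide cited below):

```python
def stopwatch_correction(reference_s, display_s):
    """Compare a stopwatch reading against a traceable time reference
    over one measured interval."""
    offset = display_s - reference_s         # seconds fast (+) or slow (-)
    rate_error = offset / reference_s        # dimensionless rate error
    return offset, rate_error

# Hypothetical example: over a 3600 s reference interval, the stopwatch
# under test displays 3600.5 s.
offset, rate = stopwatch_correction(3600.0, 3600.5)
print(f"offset: {offset:+.2f} s, rate error: {rate:+.2e}")
print(f"projected drift: {rate * 86400:+.1f} s per day")
```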
For a complete discussion, please download the NIST SP960-12 recommended practice guide: Stopwatch and Timer Calibrations.
What is the origin of hours, minutes and seconds?
A sundial described in 1300 BCE reveals that the Egyptians divided the daily cycle into ten hours of daylight from sunrise to sunset, two hours of twilight, and twelve hours of night. Their calendar year was divided into 36 decans, each ten days long, plus five extra days, for a 365-day year. Each decan was equivalent to a third of a zodiacal sign and was represented by a decanal constellation. The night corresponded to about twelve decans, and half a day to eighteen decans. Similar to the system used in East Asian clocks, the night was thus divided into twelve hours, with seasonal variations in the hour's length. Later, Hellenistic astronomers introduced equinoctial hours of equal length.
The Babylonians (in about 300-100 BCE) performed astronomical calculations in the sexagesimal (base-60) system. This was extremely convenient for simplifying time division, since 60 is evenly divisible by 2, 3, 4, 5, 6, and 10. What we now call a minute derives from the first fractional sexagesimal place (1/60 of an hour); the second fractional place (1/3600 of an hour) is the origin of the second.
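As a small aside (our illustration, not from the source), the minute and second fall out of the base-60 expansion of a fractional hour:

```python
from fractions import Fraction

def sexagesimal_places(hours, places=2):
    """Expand the fractional part of an hour count into base-60 digits:
    the first digit is minutes, the second is seconds."""
    digits = []
    frac = hours - int(hours)
    for _ in range(places):
        frac *= 60
        digits.append(int(frac))
        frac -= int(frac)
    return digits

# 1818 seconds is 0.505 hour; its first two base-60 places are
# 30 (the minutes) and 18 (the seconds).
print(sexagesimal_places(Fraction(1818, 3600)))  # [30, 18]
```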
For a more complete discussion, see this article: "Why is a minute divided into 60 seconds, an hour into 60 minutes, yet there are only 24 hours in a day?"
What is the history of clocks, calendars and other aspects of timekeeping?
For information about the history of clocks, watches, calendars, daylight saving time, and a variety of other topics, please visit the historical exhibits page.
What are Julian Date and Modified Julian Date (MJD)?
A Julian Date is the interval of time in days and fractions of a day since Greenwich noon, January 1, 4713 B.C. It was introduced by astronomers to provide a single system of dates that could be used when working with different calendars. The Julian Date for the day beginning at UTC noon, January 1, 2010 was 2455198. For that day at 1800 UTC, the Julian Date was 2455198.25.
Because the Julian Date is such a large number, the Modified Julian Date (MJD), which references a much more recent epoch, is often used instead. MJD is the Julian Date minus 2400000.5, which is equivalent to the number of days elapsed since midnight on November 17, 1858. The half-day offset shifts the reference from the astronomical day boundary of noon to the modern day boundary of midnight. MJD is useful for determining the interval between two days without involving calendar dates. The MJD for January 1, 2010 was 55197.
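A minimal sketch of the JD/MJD relationship, using only the Python standard library and assuming UTC date-times (the function names are ours):

```python
from datetime import date, datetime

MJD_EPOCH_ORDINAL = date(1858, 11, 17).toordinal()  # day ordinal of MJD 0

def mjd(d: date) -> int:
    """Modified Julian Date of the day beginning at midnight UTC."""
    return d.toordinal() - MJD_EPOCH_ORDINAL

def jd(dt: datetime) -> float:
    """Julian Date: days and fractions of a day since noon, Jan 1, 4713 BC."""
    day_fraction = (dt.hour * 3600 + dt.minute * 60 + dt.second) / 86400
    return mjd(dt.date()) + 2400000.5 + day_fraction

print(mjd(date(2010, 1, 1)))              # 55197
print(jd(datetime(2010, 1, 1, 12, 0)))    # 2455198.0  (noon UTC)
print(jd(datetime(2010, 1, 1, 18, 0)))    # 2455198.25 (1800 UTC)
```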
What are some sources for further reading about clocks and timekeeping?
A NIST publication called From Sundials to Atomic Clocks (ISBN 0-16-050010-9) is a good place to begin. The book was written for a general audience and provides a comprehensive, easy-to-understand introduction to the field of time and frequency. It and other general-interest time and frequency publications can be downloaded from the NIST Time and Frequency publication database.