An atomic clock is a clock that uses an atomic resonance frequency standard as its timekeeping element, counting cycles of the resonance to mark the passage of time. Early atomic clocks were masers with attached equipment. Today's best atomic frequency standards (or clocks) are based on absorption spectroscopy of cold atoms in atomic fountains. National standards agencies maintain an accuracy of 10⁻⁹ seconds per day (roughly one part in 10¹⁴), with precision set by the frequency of the radio transmitter pumping the maser. These clocks maintain a continuous and stable time scale, International Atomic Time (TAI). For civil time, another time scale is disseminated, Coordinated Universal Time (UTC). UTC is derived from TAI but synchronized, using astronomical observations, with the passing of day and night.
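A minimal sketch of the counting principle and the accuracy arithmetic described above, assuming only the cesium-133 hyperfine transition frequency that defines the SI second; the function and constant names are illustrative, not part of any real instrument's interface:

```python
# Sketch: an atomic clock derives elapsed time by counting cycles of a
# resonance frequency. The constant below is the cesium-133 hyperfine
# transition frequency that defines the SI second; the names here are
# illustrative only.

CS133_HYPERFINE_HZ = 9_192_631_770  # defined frequency of the Cs-133 transition

def elapsed_seconds(cycle_count: int) -> float:
    """Convert a count of resonance cycles into elapsed SI seconds."""
    return cycle_count / CS133_HYPERFINE_HZ

# One SI second corresponds to exactly 9,192,631,770 counted cycles.
assert elapsed_seconds(CS133_HYPERFINE_HZ) == 1.0

# The quoted accuracy of 10^-9 seconds per day, expressed as a fractional
# frequency error over the 86,400 seconds in a day:
SECONDS_PER_DAY = 86_400
fractional_error = 1e-9 / SECONDS_PER_DAY
print(f"{fractional_error:.2e}")  # ~1.16e-14, i.e. about 1 part in 10^14
```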
The first atomic clock, based on ammonia, was built in 1949 at the U.S. National Bureau of Standards (NBS). The first accurate atomic clock, a cesium standard based on a transition of the cesium-133 atom, was built by Louis Essen in 1955 at the National Physical Laboratory in the UK. This led to the internationally agreed definition of the second based on atomic time: since 1967, the second has been defined as the duration of 9,192,631,770 periods of the radiation corresponding to the hyperfine ground-state transition of the cesium-133 atom.
In August 2004, NIST scientists demonstrated a chip-scale atomic clock. According to the researchers, it was one-hundredth the size of any other atomic clock and required just 75 mW of power, making it suitable for battery-driven applications. The device could conceivably become a consumer product: it would be much smaller, consume far less power, and cost much less to manufacture than the cesium-fountain clocks used by NIST and the USNO as reference clocks.