Time keeps on slipping, the song says, but thanks to researchers at the University of Tokyo, it is slipping a little bit less than it used to. New cryogenic optical lattice clocks use freezing temperatures to slow the inevitable loss of time to only one second in 16 billion years.
Since that is about two billion years more than the current age of the universe, we shouldn’t have to worry about fiddling with the second hand for quite a while yet.
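The figures above translate into a dimensionless fractional error, which is how physicists usually quote clock performance. A quick back-of-the-envelope check (using the article's numbers and a 365.25-day year):

```python
# Convert "1 second lost in N years" into a fractional frequency error.
# Values are from the article; the year length is a 365.25-day Julian year.
SECONDS_PER_YEAR = 365.25 * 24 * 3600        # ~3.16e7 seconds

# Tokyo cryogenic clock: 1 second in 16 billion years
tokyo_error = 1 / (16e9 * SECONDS_PER_YEAR)
print(f"Tokyo clock:     {tokyo_error:.1e}")   # ~2.0e-18

# Earlier NIST/JILA clock: 1 second in 5 billion years
nist_error = 1 / (5e9 * SECONDS_PER_YEAR)
print(f"NIST/JILA clock: {nist_error:.1e}")    # ~6.3e-18
```

In other words, the new clock is stable to roughly two parts in a billion billion, about a threefold improvement over the earlier strontium clock mentioned below.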
The clock measures time, as do all of the most accurate human timepieces, by tracking the natural oscillations of atoms. The team at the University of Tokyo, led by Hidetoshi Katori, achieves this feat of timekeeping by trapping strontium atoms in a lattice of coherent light beams tuned to a very specific wavelength, chosen so that the trapping light itself does not disturb the atoms' oscillation. Since heat, at the atomic level, is expressed largely as motion, cooling the apparatus to −180 °C further reduces potential inaccuracy. The measured oscillations of the trapped strontium atoms, like the ticking of sub-microscopic pendulums in a miniaturized grandfather clock, provide the ticks of the most precise timekeeping device ever created.
This isn’t a completely unheralded advance. Last year, physicists at the National Institute of Standards and Technology (NIST, the U.S. agency responsible for maintaining the country’s official time) and JILA (the Joint Institute for Laboratory Astrophysics, a venture between NIST and the University of Colorado Boulder) unveiled a strontium-based clock operating on similar principles, which reduced time loss to about one second in every five billion years.
The devices are not, strictly speaking, official, in that they may be more precise than the standard International System of Units definition of the second itself, which is based on the frequency of radiation emitted by the cesium-133 atom in its ground state at a temperature of absolute zero. But since repeated experimentation has shown strontium and ytterbium atomic clocks to be more precise and stable than cesium-based devices, they may eventually supplant cesium clocks as the standard.
The secret behind the increased stability and accuracy isn’t simply the cold temperatures or the laser lattice, according to this NIST press release, but the fact that the strontium clock operates at optical frequencies, which are much higher than the microwave frequencies used by cesium clocks.
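The frequency gap is enormous. The cesium transition frequency below is the exact value used in the SI definition of the second; the strontium optical transition frequency is approximate (~429 THz):

```python
# "Tick rate" comparison between a cesium microwave clock and a
# strontium optical clock. The cesium figure is the exact SI value;
# the strontium figure is an approximation for illustration.
CESIUM_HZ = 9_192_631_770      # microwave hyperfine transition (exact, SI)
STRONTIUM_HZ = 429e12          # optical clock transition (~429 THz, approx.)

ratio = STRONTIUM_HZ / CESIUM_HZ
print(f"The optical clock divides each second ~{ratio:,.0f}x more finely")
```

Dividing the second into tens of thousands of times more "ticks" is what makes the finer precision possible in the first place.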
It’s not much use trying to judge a clock’s accuracy when you have only a single clock, so the University of Tokyo team constructed two, and ascertained their precision by comparing the values the two devices report against each other.
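The two-clock comparison can be sketched in a few lines. With no external reference more accurate than the clocks themselves, agreement is quantified as the fractional frequency difference between the pair; the readings below are invented purely for illustration:

```python
# Minimal sketch of the two-clock comparison idea: precision is
# estimated from the dimensionless frequency offset between the pair.
def fractional_difference(f1: float, f2: float) -> float:
    """Fractional frequency difference between two clock readings (Hz)."""
    mean = (f1 + f2) / 2
    return abs(f1 - f2) / mean

# Two hypothetical optical clocks near 429 THz that disagree by 1 Hz:
clock_a = 429_000_000_000_000.0
clock_b = clock_a + 1.0

print(f"agreement: {fractional_difference(clock_a, clock_b):.1e}")  # ~2.3e-15
```

In practice the clocks are compared repeatedly over long averaging times, so the measured agreement can be far better than any single pair of instantaneous readings suggests.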
Katori expresses hope, in this Tech Times article, that the advent of the improved strontium clock will accelerate a redefinition of the official international second to a more accurate measure. He also hopes that the clocks, currently bulky machines resembling mainframe computers, can be sufficiently miniaturized for field use.