Frequency Biases Associated with Distributed Cavity Phase and Microwave Leakage in the Atomic Fountain Primary Frequency Standards IEN-CSF1 and NIST-F1
Steven R. Jefferts, Jon H. Shirley, Neil Ashby, Thomas P. Heavner, Elizabeth A. Donley, F. Levi, Eric A. Burt, G. J. Dick
Frequency shifts in atomic frequency standards caused by distributed cavity phase or microwave leakage have been studied since the earliest days of the thermal beam standards [1,2], and have been the subject of continuing theoretical and experimental work over the last fifty years [3-7]. Laser-cooled fountain frequency standards pose different problems with respect to both distributed cavity phase and microwave leakage than thermal beam standards do. This is due both to the very different microwave structure used in fountains and to the lower center-of-mass velocity and very narrow velocity distribution, which allow operation at significantly elevated microwave power in fountain standards.
Jefferts, S., Shirley, J., Ashby, N., Heavner, T., Donley, E., Levi, F., Burt, E. and Dick, G., Frequency Biases Associated with Distributed Cavity Phase and Microwave Leakage in the Atomic Fountain Primary Frequency Standards IEN-CSF1 and NIST-F1, Proc., European Frequency and Time Forum, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=50108 (Accessed December 8, 2023)