I'm talking about the drift of the timebase used as the counter's reference. You could always go with a rubidium standard such as a Ball-Efratom, or use something phase-locked to WWVB.
Spectracom made a number of the latter. One of their offerings actually measured and charted drift as a function of propagation (atmospheric) variance, and output the difference as a proportional voltage, which lent itself to automatic compensation schemes.
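To make that concrete, here's a minimal sketch of the idea in Python. I don't have the Spectracom scale factor in front of me, so the constant and function names below are placeholders of my own, not anything out of their manual:

```python
# Assumed: the receiver outputs a voltage proportional to the measured
# frequency difference between the local timebase and the
# propagation-corrected WWVB carrier.
SCALE = 1e-9  # fractional frequency error (delta-f/f) per volt; placeholder value

def fractional_error(error_volts: float) -> float:
    """Convert the receiver's error voltage into the timebase's
    fractional frequency error y = delta-f / f_nominal."""
    return error_volts * SCALE
```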
If a total drift figure could thus be extrapolated from a frequency/propagation-corrected model, then used to calculate an offset for an observed signal, one might be able to obtain still more accuracy when making the measurement - yes?
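Seems plausible to me: once you have a fractional error figure for the reference, the correction is just arithmetic. A fast reference shortens the counter's gate time, so the counter under-reads and the observed value gets scaled back up. A sketch (the numbers in the example are mine, chosen for illustration):

```python
def corrected_frequency(f_observed_hz: float, y: float) -> float:
    """Correct a counter reading for a known reference error.

    y is the reference's fractional frequency error,
    y = (f_ref_actual - f_ref_nominal) / f_ref_nominal.
    A fast reference (y > 0) shortens the gate time by 1/(1+y),
    so the counter reads low and the true value is
        f_true = f_observed * (1 + y)
    """
    return f_observed_hz * (1.0 + y)

# Example: a 10 MHz reading with the reference running fast by 2 parts in 1e9
print(corrected_frequency(10_000_000.0, 2e-9))  # -> 10000000.02 Hz
```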