Frequency Standard - Rubidium or GPS?


Here's how I calibrated against WWV

Get a receiver that has a speaker or headphones output. Connect a reasonable frequency counter to the headphones. Tune the receiver to WWV. Get a synth and tune it to WWV minus N Hz (N = let's say 300). The counter will now show the 300 Hz beat plus any error in the synth.

The nice thing about this is that the counter etc doesn't need to be very good to get the synth's calibration just about bang on.
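
To put rough numbers on the method (a quick Python sketch; the 10 MHz WWV carrier is a real WWV frequency, but the counter reading is invented for illustration):

    f_wwv = 10_000_000.0        # Hz, WWV carrier used as the reference
    offset = 300.0              # Hz, deliberate synth offset below WWV
    f_synth = f_wwv - offset    # synth dial setting: 9,999,700 Hz

    reading = 300.7             # Hz, hypothetical beat note shown on the counter
    error_hz = reading - offset             # synth error in Hz at 10 MHz
    fractional_error = error_hz / f_wwv     # error as a fraction of the carrier

    print(f"synth error: {error_hz:+.1f} Hz -> {fractional_error:+.1e} fractional")
    # -> synth error: +0.7 Hz -> +7.0e-08 fractional
    # Note: even a 100 ppm error in the counter's own timebase shifts the 300 Hz
    # reading by only 0.03 Hz, i.e. 3e-9 of the carrier -- which is why the
    # counter doesn't need to be very good.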

Reply to
MooseFET

There is no compelling reason for that. Both GPS and rubidium standards are disciplined crystal oscillators and for short term stability both depend on their crystal only.

Rb may have an advantage with respect to holdover; other than that, the higher system price of Rb could justify a slightly better crystal oven.

Gerhard

Reply to
Gerhard Hoffmann

There are several off air frequency standard designs that fairly regularly get recycled in the various magazines. A bit like egg timers.

In the UK at least it is trivial to take a long-wave 200 kHz coil and slug it with a bit more capacitance to tune for either 60 kHz Rugby (MSF) or 77.5 kHz DCF77. Although monolithic chips are cheaply available, a discrete receiver is easy enough - even the original transistor radio would do.
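
Rough retuning arithmetic, as a quick Python sketch (the 1 mH coil value is invented; only the capacitance ratio matters):

    import math

    # Capacitance needed to resonate inductance L at frequency f:
    # from f = 1/(2*pi*sqrt(L*C))  =>  C = 1/((2*pi*f)^2 * L)
    def cap_for(f_hz, l_henry):
        return 1.0 / ((2.0 * math.pi * f_hz) ** 2 * l_henry)

    L = 1.0e-3                         # 1 mH long-wave coil (value invented)
    c_200k = cap_for(200e3, L)         # ~633 pF to resonate at 200 kHz

    for f_target in (60e3, 77.5e3):    # MSF Rugby and DCF77
        c_needed = cap_for(f_target, L)
        print(f"{f_target/1e3:5.1f} kHz: {c_needed*1e12:6.0f} pF "
              f"({c_needed/c_200k:.1f}x the 200 kHz value)")
    # The ratio depends only on frequency: (200/60)^2 ~ 11x, (200/77.5)^2 ~ 6.7x.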

A reasonably helpful generic page is online at:

formatting link

The Rugby off-air signal was used in the late 1970s by Duffett-Smith et al. to synchronise local Rb oscillators for remote VLBI stations at low frequencies.

ISTR they could easily detect the presence or absence of dew at the Rugby transmitter from the diurnal variation of phase errors.

Regards, Martin Brown

Reply to
Martin Brown

The massive 299 kHz beacon spike on HDTV in the US is easily seen on a spectrum analyzer. It's supposed to be on a broadcast carrier that is within +/- 2 Hz, no small feat at UHF. I'm just waiting to see someone tap that resource. +/- 2 Hz at UHF means the station needs some sort of stable timebase for the master PLLs.
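
To put the +/- 2 Hz claim in perspective (my own arithmetic; the 600 MHz mid-band UHF channel is just an example value):

    # Fractional accuracy implied by holding a UHF carrier to +/- 2 Hz.
    f_carrier = 600e6          # Hz, a hypothetical mid-band UHF channel
    tolerance = 2.0            # Hz

    fractional = tolerance / f_carrier
    print(f"+/-{tolerance} Hz at {f_carrier/1e6:.0f} MHz = {fractional:.1e}")
    # -> about 3.3e-09, i.e. a few parts in 10^9 -- ovenized-crystal/GPSDO territory.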

Meanwhile, back at the ranch, my rubidium draws almost 2 amps at 24 V at startup, and does so for about 20 minutes. It then drops down to 500 mA or so. You might want to keep that in mind.
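
The supply-sizing arithmetic that follows from those figures (a small Python sketch using the numbers quoted above):

    # Rubidium standard supply budget, from the figures quoted above.
    v_supply = 24.0            # V
    i_warmup = 2.0             # A, for roughly the first 20 minutes
    i_locked = 0.5             # A, steady state once the physics package is warm

    print(f"warm-up:  {v_supply * i_warmup:.0f} W for ~20 min")   # ~48 W
    print(f"running:  {v_supply * i_locked:.0f} W continuous")    # ~12 W
    # The supply (and any battery backup) has to be sized for the 48 W warm-up,
    # not the 12 W steady state.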

Steve

Reply to
osr

Just because it's cheap doesn't automatically make it crap. It keeps time as well as any clock I've ever seen, and presumably synchronizes itself to the WWVB signal; I could put it on my scope/freq. counter, but I'm not that ambitious. It was just an idea.

Thanks, Rich

Reply to
Rich Grise

Something about 100 ms accuracy.

Back again to the Goldstone Apollo station. In back of the Collins timing system was something that was not used anymore when I started working there: a Marantz Model 9 mono tube amplifier to drive wall clocks! You still had to set the clock manually.

greg

Reply to
GregS

I never implied that the clock was crap. I implied that your concept of using it to calibrate a counter in a minute was crap.

Sixty seconds of a one Hz signal is...wait...let me do the math... oh, yes, it's 60 counts.

You are NOT gonna get 10^-6 accuracy out of counting sixty counts...ever.

But wait...my counter has a time interval measurement. OK, what are the risetime of the signal, the jitter, and the accuracy of the time-interval function as specified by the counter manufacturer? There's a reason you don't try to calibrate a counter's reference in time-interval mode. Averaging over 60 clocks helps by a factor of 8 or so on a good day for random errors. It doesn't help much on systematic errors.

Assuming the average accuracy is good, there are ways to generate a stable reference, but not in 60 counts.
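
Putting back-of-envelope numbers on that (a quick Python sketch; the 1 us rms jitter per edge is an assumed figure for illustration):

    import math

    # Plain frequency counting: gate a 1 Hz signal for 60 s.
    counts = 60
    count_resolution = 1.0 / counts          # +/-1 count => ~1.7e-2, nowhere near 1e-6

    # Time-interval / period averaging: assume (for illustration) 1 us of rms
    # random timing error per edge; averaging N intervals shrinks it by sqrt(N).
    sigma_single = 1e-6                      # s, per-measurement random error (assumed)
    n = 60
    sigma_avg = sigma_single / math.sqrt(n)  # ~0.13 us, the "factor of 8 or so"
    fractional = sigma_avg / 1.0             # relative to a 1 s interval

    print(f"counting resolution : {count_resolution:.1e}")
    print(f"averaged TI error   : {fractional:.1e}")
    # Averaging only helps the random part; a systematic trigger or risetime
    # error is not reduced at all, which is the point being made above.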

I just wanted to see some math supporting your assertion.


Reply to
spamme0

Risetime? Jitter? Wtf. If you get a stable reading it is a stable reading. A crude period measurement that counts the 10 MHz reference clock gives you 1 part in 10 million resolution for a 1 Hz signal.

I had no problem calibrating my counter/timer from a GPS PPS output. Get as close as you can with a period measurement, then scope the counter's 10 MHz reference triggered by the PPS. Trim for zero drift. The limiting factor was the resolution of the reference trimmer, not jitter or observation of drift.
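
A quick sketch of the arithmetic behind that procedure (Python; the drift figure and observation time are invented for illustration):

    # Step 1: period-measure the PPS against the counter's 10 MHz timebase.
    # Counting 10 MHz cycles across one 1 s PPS interval resolves 1 part in 1e7.
    f_ref = 10e6
    resolution = 1.0 / f_ref                 # 100 ns on a 1 s interval = 1e-7

    # Step 2: trigger the scope on the PPS, display the 10 MHz, and watch the
    # phase drift.  If the waveform walks by 'dt' seconds over 't_obs' seconds
    # of observation, the fractional frequency error is dt / t_obs.
    dt = 100e-9                              # s, hypothetical drift: one full 10 MHz cycle
    t_obs = 100.0                            # s, hypothetical observation time
    fractional_error = dt / t_obs            # 1e-9

    print(f"period-measurement resolution: {resolution:.0e}")
    print(f"observed drift {dt*1e9:.0f} ns over {t_obs:.0f} s -> {fractional_error:.0e}")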

Reply to
nospam

An ordinary counter gets 10 digits on one count...of the gate. Reciprocal counters can get right down to the jitter level, so even assuming that the clock's 1-Hz pulses have 1 us of rms jitter--very unlikely--you'd be down in the low parts in 10**8 in 1 minute. The clock has long-range order (i.e. the mean positions are very accurate, they just jiggle around a bit). How long you have to go to get that accuracy depends on how often the clock gets updated. Assuming that it's within 100 ms per day, that's just over 1 ppm.
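
The same numbers worked through (a small Python sketch; treating the minute as one start/stop interval with the assumed 1 us rms jitter on each edge):

    import math

    # Jitter averaging over a 1-minute measurement of 1 Hz edges.
    jitter_rms = 1e-6                       # s, assumed rms jitter per edge
    t_meas = 60.0                           # s
    # Error of a start/stop interval spanning the whole minute (two jittery edges):
    frac_jitter = jitter_rms * math.sqrt(2) / t_meas
    print(f"jitter-limited: {frac_jitter:.1e}")     # ~2.4e-8, "low parts in 10^8"

    # Long-term rate accuracy if the clock stays within 100 ms per day.
    frac_rate = 0.100 / 86400.0
    print(f"100 ms/day   : {frac_rate:.1e}")        # ~1.2e-6, "just over 1 ppm"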

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal
ElectroOptical Innovations
55 Orchard Rd
Briarcliff Manor NY 10510
845-480-2058
hobbs at electrooptical dot net
http://electrooptical.net
Reply to
Phil Hobbs

Well...it's been 30 years since I was the hardware design manager for a counter company....and I like to learn new things. While what you say is spot-on theoretically, there are practical issues to deal with.

My input was related to:
_______________________
I just want to recalibrate my so-so frequency counters

and

Well, for what it's worth, my $10.99 plus tax wall clock is sync'ed to WWVB; I'm sure there's a 1 Hz pulse in there somewhere. :-)

I'd imagine it would make a pretty good reference if you wanted to wait, say, 60 seconds or so to get a good reading.
_______________________

I asked about the math that allowed the specified source to calibrate the specified counters....vague as those specs are.

So, the first question to you is, "with your method, how many reference frequencies exist that produce a zero drift display on your scope?" How do you tell which is the right one?

Reply to
spamme0

As many as there are Hz in the counter reference trimming range.

You measure the PPS period to get 'on' the right Hz first; you can measure the period again afterwards to confirm.
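
A sketch of why the period measurement breaks the ambiguity (Python; the +/-5 Hz trim range is an assumed figure):

    # Any reference of the form 10 MHz + n Hz (n an integer) gives a stationary
    # trace when the scope is triggered by a 1 Hz PPS, so the display alone is
    # ambiguous across the trim range.
    trim_range_hz = 5                # +/- Hz, hypothetical trimmer range
    stationary_candidates = 2 * trim_range_hz + 1    # 10 MHz - 5 ... 10 MHz + 5

    # The period measurement breaks the tie: counting 10 MHz cycles over one
    # PPS second resolves 1e-7, i.e. +/- 1 Hz at 10 MHz, which is enough to
    # pick the right integer before fine-trimming on the scope.
    period_resolution_hz = 10e6 * 1e-7

    print(f"stationary displays in range : {stationary_candidates}")
    print(f"period-measurement resolution: +/-{period_resolution_hz:.0f} Hz at 10 MHz")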

Reply to
nospam

Got it... Assume the counter can measure period with more resolution than is implied by the reference clock rate.

I use the period function to measure the 1 Hz and adjust the 10 MHz oscillator to better than a part in 10^7. Then I set my oscilloscope sweep speed to 1 us/division...oughta be able to easily resolve 10 cycles/division...and trigger the scope on the 1 Hz. I get a display that sweeps once per second and I dial it right in. I just tried it... I don't think the trigger jitter spec on my scope is anywhere near a part in 10^7, but it don't matter, cause I couldn't see anything. Your oscilloscope has a LOT better writing rate than mine!!
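
One likely reason nothing was visible is simple duty-cycle arithmetic (a rough Python sketch assuming an analog CRT and a 10-division screen):

    # At 1 us/div on a 10-division screen, each 1 Hz trigger paints the phosphor
    # for only ~10 us, once per second.
    sweep_time = 10 * 1e-6          # s of trace per trigger
    trigger_period = 1.0            # s between PPS triggers

    duty_cycle = sweep_time / trigger_period
    print(f"beam-on duty cycle: {duty_cycle:.0e}")   # 1e-05
    # With the beam on 0.001% of the time, an analog CRT trace is far too dim to
    # see; a digital scope (or single-shot capture) sidesteps the problem.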

While your approach is theoretically sound, I maintain that the average person with average equipment can't take a so-so counter, an $11 WWVB-disciplined wall clock and (assuming they even have one) an average oscilloscope and implement it in SIXTY seconds or so of measurement time, as the original suggestion implied.

The devil is in the details.

Reply to
spamme0

Digital scope; since you are only looking at 10 MHz it doesn't have to be a fast one. Jitter from the trigger and the PPS wasn't enough to cause any problem observing drift. The PPS jitter was maybe 30 or 40 ns, and it was either 'zero' or a whole 30 or 40 ns, obviously synced to some roughly 30 MHz clock in the GPS module.
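
The quantization arithmetic behind that observation (a quick Python sketch; the 26 MHz and 30 MHz module clocks are assumed, plausible values):

    # If the receiver can only place its PPS edge on ticks of its internal clock,
    # the edge jumps in steps of one clock period.
    for f_clk in (26e6, 30e6):      # plausible GPS-module reference clocks (assumed)
        step_ns = 1e9 / f_clk
        print(f"{f_clk/1e6:.0f} MHz clock -> PPS quantization of ~{step_ns:.0f} ns")
    # 26 MHz -> ~38 ns, 30 MHz -> ~33 ns: consistent with the observed 30-40 ns
    # "all or nothing" jumps.  On a 1 s interval that is only ~4e-8 per shot, and
    # it averages out over many seconds of drift observation.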

Pulling a PPS signal from an old handheld GPS and using test equipment I already had let me calibrate my counter/timer at no cost.

I doubt a WWVB wall clock would contain any useful signal. I doubt they lock on to anything; they just occasionally correct their time from the time code signal.

Reply to
nospam

Here's a nifty method using a VNA for phase comparison of oscillators:

formatting link

-- Mikko

Reply to
Mikko Syrjälä

That's because during network broadcasts the frequency WAS dead nuts on! The networks used rubidium standards for it. Local stations often did not. And it was even better than that. NBS tracked all three networks and gave exact measurements to primary standards. It doesn't get any better than this. Oh wait. It does get better!!! NBS also tracked the frame times so as to provide accurate time signals referenced to NBS standard clocks. This system was amazingly cool and accurate to astounding degrees. I had a temperature controlled XO clock that I calibrated with this system regularly as well as my counters. Just too cool for words!

But like most things in life, things that are too cool must be destroyed.

And that is the story of where that wonderful system went. NBS really had no control over the system and the networks trashed it for their own convenience. Digital TV put the final nails in the coffin.

So yeah, you can buy your own rubidium standard, but if you want to give it a decent calibration you have to carry it to Boulder or some other place with a primary standard. NBS radio transmissions are so 1940 and work about as well. Yeah, they keep your "atomic clock" within seconds, but think of the phase accuracy needed and how long you'd have to measure to get some decent timing. Personally I'd go with the rubidium standard. But if you can find a place to calibrate it, you can also haul your counters there as well.
Reply to
Benj

No, it wasn't compatible with some newer NTSC broadcast equipment. To meet the tighter video switching requirements, a framestore was needed, and that is what stopped the network-fed colorburst from being transmitted. Also, most network stations only have limited live network feeds anymore.

We used a Fluke 207-5 VLF receiver to pick up WWVB at Microdyne. The 60 kHz output was fed to the plant's frequency standard, which was phase-locked to WWVB. It was used that way for almost 20 years, until it was replaced with a GPS-based standard. That 10 MHz signal was fed to a one-input, 32-output distribution amplifier, which fed the external reference input on signal generators and frequency counters. In fact, a few of our HP 5245L counters had bad internal oscillators, and this allowed them to be used for time period measurements for the AGC systems.

The older Fluke system was kept as a backup.

--
You can't have a sense of humor, if you have no sense!
Reply to
Michael A. Terrell

[Quoted fragment: ...the subcarrier, when locked to the transmitter, will be a frequency standard readily available.]

They were/are pretty accurate at nominal frequency, but in the TV receiver they were locked to the incoming signal, always were; it wouldn't work otherwise.
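
For scale, a bit of hedged arithmetic (Python; the 315/88 MHz nominal subcarrier is standard NTSC, but the +/-10 Hz broadcast tolerance is my recollection and should be treated as an assumption):

    from fractions import Fraction

    # Nominal NTSC colour subcarrier: 315/88 MHz.
    f_sc = Fraction(315, 88) * 10**6
    print(f"nominal subcarrier: {float(f_sc):.6f} Hz")     # 3579545.454545... Hz

    # A loose broadcast tolerance of +/-10 Hz (assumed figure) versus a network
    # feed slaved to a rubidium standard at, say, 1e-10:
    print(f"+/-10 Hz tolerance : {10 / float(f_sc):.1e}")  # ~2.8e-06
    print(f"Rb-derived feed    : {1e-10:.0e}")             # parts in 1e10
    # Either way, a receiver locks its local 3.58 MHz oscillator to whatever
    # arrives, which is the point made above.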

Reply to
JosephKK

Actually the signal is a bit faster and not very reliable. Moreover it is disciplined by better clocks.

Reply to
JosephKK

OK, let it run all day or all week - eventually you'll get a usable reading.

My point was merely that it's doable, if you want to wait long enough. :-)

Thanks, Rich

Reply to
Rich Grise
