Re: PLL / DPLL phase question

> I am designing a Digital-PLL and I'm trying to figure out what DAC
> resolution I need to drive my VCO given a .1 degree phase accuracy
> requirement (that is, reference and output need to be within .1 degrees
> of each other).
>
> The DPLL output operates over a range of 1 Hz to 50 Hz (not too tough),
> and my VCO gain is 11.8333 Hz / Volt. My phase detector is the standard
> two-DFF type and can detect a minimum time difference of 20 ns in the
> two waveforms. DAC output (which controls the VCO) range is 0 - 5 V.
>
> My problem is that I'm having trouble relating my phase requirement to
> DAC voltage step-size (number of bits).

Assuming that the DAC is updated once each cycle of the output frequency, you want your frequency to be within f(1 +- 1/3600), since 0.1 degree out of 360 is 1/3600 of a cycle; that bounds the maximum phase error accumulated over one cycle, assuming that the phase was exactly matched at the beginning. That suggests that you want at least a 12-bit converter (4096 steps > 3600).
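Thad's arithmetic can be checked with a short sketch using the numbers from the thread (variable names are mine, not from the posts):

```python
import math

# 0.1 degree of phase over one full 360-degree cycle is a fractional
# frequency tolerance of 1/3600, assuming the DAC is updated once per
# output cycle and the phase starts exactly matched.
phase_accuracy_deg = 0.1
frac_tolerance = phase_accuracy_deg / 360.0        # = 1/3600

# The DAC must resolve at least 3600 distinct steps at a given frequency:
steps_needed = 1.0 / frac_tolerance
bits_needed = math.ceil(math.log2(steps_needed))
print(bits_needed)                                 # 12
```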

If you did external filtering, you could use fewer bits and toggle the DAC setting more frequently, such that the average voltage corresponds to the correct frequency.
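A minimal sketch of that toggling idea, assuming a first-order accumulator decides when to bump between two adjacent DAC codes (essentially first-order sigma-delta dithering; the function name and numbers are illustrative, not from the thread):

```python
def dither_codes(target_v, vref=5.0, bits=12, n_updates=1000):
    """Return DAC codes whose filtered average approximates target_v."""
    lsb = vref / (2 ** bits)
    low_code = int(target_v / lsb)       # code just below the target
    frac = target_v / lsb - low_code     # fraction of an LSB still needed
    codes = []
    acc = 0.0
    for _ in range(n_updates):
        acc += frac                      # accumulate sub-LSB error
        if acc >= 1.0:                   # carry out: emit the higher code
            acc -= 1.0
            codes.append(low_code + 1)
        else:
            codes.append(low_code)
    return codes

# The external analog filter does the averaging; here we average numerically.
codes = dither_codes(2.50003)
avg_v = sum(codes) / len(codes) * (5.0 / 4096)
```

The averaged output lands within a small fraction of an LSB of the target, which is the point: resolution beyond the converter's native step comes from the filtering.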

Thad

Reply to
Thad Smith

I'm following up my own post for a correction.

12 bits should be sufficient at the full-scale frequency. Since the OP said he needed to track 1 to 50 Hz with 0.1 degree max phase error, he will need an additional 6 bits to get the required resolution at the low end (1 Hz). Note that 18 bits of accuracy aren't needed -- only 18 bits of resolution, because of the feedback. The resolution could be achieved with two 10-bit or 12-bit converters.
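Thad's corrected count can be reproduced numerically (a sketch with the thread's figures):

```python
import math

f_max, f_min = 50.0, 1.0
frac_tol = 0.1 / 360.0          # 1/3600, from the phase requirement

# The finest step needed is 1/3600 of the LOWEST frequency (1 Hz),
# while the converter's full scale must still reach 50 Hz:
steps = f_max / (f_min * frac_tol)          # 50 * 3600 = 180,000
total_bits = math.ceil(math.log2(steps))    # 18

# That is 6 more than the 12 bits that suffice at full scale:
extra_bits = total_bits - math.ceil(math.log2(1.0 / frac_tol))
print(total_bits, extra_bits)               # 18 6
```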

This is still true and could be used instead of another converter.

Thad

Reply to
Thad Smith

Thad -

sounds good in theory - but there has never been a 12-bit DAC with 12 bits of reality - the LSB is junk with power supply noise and non-linearity. The 12-bit DACs are really capable of 11 - 11.3 bits with GOOD (excellent) ground planes and linear regulators and a cold plate for temperature stability.

The phase error means that his load

Reply to
Andrew Paule

Thanks for the responses everyone. I'm still a little puzzled, but it's gone from dark to murky =).


I'm on the same page with regard to the 1/3600 factor at a given frequency.

I think what Thad is saying is that I need 12 bits at a given frequency, but to get to a given frequency I need more bits. If my range is 1-50 Hz, then just to reach a frequency in that range (if I could do it in integer multiples of 1 Hz) I'd need 6 bits, assuming a 1:1 correlation between a bit and an output Hz. Then I would need an additional 12 bits of sub-frequency control to meet my phase requirement.

Also, I think that the VCO's gain factor comes into play and will affect the number of bits, given that I don't have a 1:1 correlation between a bit and a Hz.
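Jay's point about the gain can be checked directly: using the 11.8333 Hz/V gain and the 0-5 V DAC range from the original post, the required voltage step works out to the same 18 bits (a sketch; variable names are mine):

```python
import math

kvco = 11.8333          # Hz per volt, from the original post
vref = 5.0              # DAC range 0 - 5 V
f_min = 1.0             # low end of the 1-50 Hz range
frac_tol = 0.1 / 360.0  # 1/3600 phase tolerance

df = f_min * frac_tol   # smallest frequency step needed (~0.000278 Hz)
dv = df / kvco          # corresponding DAC voltage step (~23.5 uV)
bits = math.ceil(math.log2(vref / dv))
print(bits)             # 18
```

So with this particular gain the answer happens to agree with the simpler steps-per-cycle count, but a different Kvco or DAC range would shift it.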

Thanks again for the responses...

-- Jay.

Reply to
se10110

For this application, the OP has error feedback, which removes the need for long term stability, accuracy, and good linearity. Noise would be more of a problem, though.

The m and n integer? What do you mean here?

Thad

Reply to
Thad Smith

Well, I cheated just a bit: I don't have a VCO, I have a motor control unit. I'm varying the voltage to the motor control unit to get a "frequency" out of it. My VCO "gain" is really the motor gain (RPM/Volt translated to Hz/Volt). I didn't want to complicate the situation by bringing that in (being a motor and not a VCO doesn't alter the number-of-bits question).

Using PWM would work for my application up to 11-12 bits (given a 40-60 MHz input clock), but beyond that my PWM output frequency drops too low. I haven't solved this problem yet... dithering may work here.
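The trade-off Jay describes falls straight out of counter-based PWM, where the carrier frequency is the clock divided by 2^bits. A sketch, assuming a 50 MHz clock (within the 40-60 MHz range he mentions):

```python
f_clk = 50e6    # assumed 50 MHz PWM input clock

for bits in (11, 12, 16, 18):
    f_pwm = f_clk / (2 ** bits)          # carrier = clock / 2^bits
    print(f"{bits:2d} bits -> {f_pwm:10.1f} Hz carrier")

# At 12 bits the carrier is ~12.2 kHz; at the full 18 bits it drops to
# ~191 Hz, which is far too close to the 1-50 Hz output to filter cleanly.
```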

I also considered using a real DAC, buffering the output and driving an SMPS in voltage-control mode (to drive the motor) but as others have pointed out, the DAC noise problems will probably kill me.

My news server has been acting funny (read: not letting me post), otherwise I would have tried to clarify a bit earlier.

But I'm curious Peter, assuming I used a 50 to 100 MHz clock, the only way to get the delays would be to make a (big) shift register and delay my signal by clock cycles, right? Wouldn't that mean a huge multiplexer on the output to select which tap I use?

How exactly would multi-phase 200 MHz clocks work out here? Generate a 0deg and a 90deg signal using the DCM on a Xilinx part or something?

Thanks again for all the helpful responses everyone.

-- Jay.

Reply to
Jay


That introduces some additional considerations. A VCO would be fairly stable for a given control voltage. Is this true of the motor? If there are load variations, can the controller keep it within the narrow speed window needed to maintain your phase margin? Even if the controller increases the drive to compensate for a load increase, it probably won't do anything to recover accumulated phase error. If you absolutely need 0.1 degree maximum phase error while the load changes, you might need a much stiffer motor drive, as well as immediate feedback from the motor to the controller, probably a high-resolution rotary encoder. If there is very little load change, or your phase error limit can be exceeded at times, it won't be as bad.

Thad

Reply to
Thad Smith
