Hi,
I have my motion control system set up as a velocity loop. Encoder pulse periods are timed to calculate instantaneous frequency, which is subtracted from a reference value to form an error signal. The timing and error generation happen in an FPGA; the control loop is implemented as a PID in a uC that samples the error register in the FPGA at 100 Hz.
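For reference, here is roughly what the uC side looks like (just a sketch -- the register address, gains, and names are all made up):

    #include <stdint.h>

    /* Hypothetical memory-mapped FPGA error register (address made up). */
    #define ERROR_REG (*(volatile int32_t *)0x40000000)

    #define TS 0.01f  /* 100 Hz sample period */

    static float kp = 1.0f, ki = 0.1f, kd = 0.01f;  /* placeholder gains */
    static float integral, prev_error;

    /* Called from a 100 Hz timer interrupt. */
    float pid_update(void)
    {
        float error = (float)ERROR_REG;                /* velocity error from FPGA */
        integral += error * TS;                        /* integral term */
        float derivative = (error - prev_error) / TS;  /* derivative term */
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;  /* drive command */
    }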
My issue is noise: I have a lot of jitter on my encoder signals (due to mechanical vibrations, I think), which causes the error signal to jump around quite a bit.
So, is there any reasonable way to remove the jitter using digital filters (FIR? IIR?) without running into aliasing problems due to sampling?
Since the data "starts out" digital, I don't have a bandlimiting low-pass filter in front of the loop, as I would if this were an analog signal going into an A/D. Using a scope, I think I can safely say the encoder counts jitter at frequencies above 100 Hz.
My thinking is that the jitter causes an FM effect on the encoder input frequency, so I'd want as high a sampling rate as possible for a filter, since I can't attenuate anything above Fs/2.
I'm guessing that if I want an FIR/IIR filter, I'd need to run it at 5 MHz or more. But since the data is only available on encoder ticks, what would I be sampling between encoder ticks? The "all digital" aspect is a bit of a mind bender.
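To make that question concrete, the only thing I can picture sampling between ticks is the last held period measurement -- a zero-order hold, something like this (names made up):

    #include <stdint.h>

    /* Hypothetical: latest encoder period, updated by the capture logic
       whenever an encoder edge arrives. */
    static volatile uint32_t latest_period;

    /* Called at a fixed rate Fs from a timer. Between encoder ticks this
       just re-reads the held value -- a zero-order hold. */
    uint32_t sample_period(void)
    {
        return latest_period;  /* repeats until the next edge updates it */
    }

Is that the right mental model for resampling event-driven data onto a uniform grid?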
I know a lot of people deal with noise by implementing moving average filters -- is this an alternative? How would it get around aliasing?
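(By moving average I mean something like the following -- a boxcar over the last N error samples; N is a placeholder:)

    #include <stdint.h>

    #define N 16  /* boxcar length -- placeholder */

    /* Simple N-point moving average of the error samples. */
    int32_t moving_average(int32_t new_sample)
    {
        static int32_t buf[N];
        static int64_t sum;
        static unsigned idx;

        sum += new_sample - buf[idx];  /* add newest, drop oldest */
        buf[idx] = new_sample;
        idx = (idx + 1) % N;
        return (int32_t)(sum / N);
    }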
Suggestions highly appreciated!
Jay.