I'm investigating the intrinsic distortion of uniformly-sampled PWM as it affects the PFD output in a fractional-N synthesizer. One thing I've tried is to replace the complex MASH-generated divide-by-N sequence with a much simpler square wave. This is what the resulting PFD output spectrum looks like:
The PWM sampling rate (the reference/comparison frequency) was 512 and the square-wave frequency was 64. You can see the fundamental and 3rd harmonic of the square wave, but there are also tones at the even-order harmonics, which I think might be distortion products.
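For what it's worth, the test setup boils down to something like the sketch below. This isn't my actual code; the oversampling factor and the two duty-cycle levels are arbitrary placeholders, but the 512/64 ratio matches the plot above.

    /* Toy model of uniformly-sampled PWM: once per reference (carrier)
       period the modulating signal is sampled and converted to a pulse
       width.  The modulator is a square wave at 64 cycles per 512 PWM
       periods, i.e. it toggles every 4 periods. */
    #include <stdio.h>

    #define FRAMES   512  /* PWM periods per record (the sampling rate) */
    #define OVERSAMP 64   /* high-rate ticks per PWM period (placeholder) */

    static double y[FRAMES * OVERSAMP];  /* high-rate PWM waveform */

    int main(void)
    {
        int k, t;

        for (k = 0; k < FRAMES; k++) {
            /* 512/64 = 8 periods per square-wave cycle, so toggle every 4 */
            double duty = ((k / 4) & 1) ? 0.75 : 0.25;  /* placeholder levels */
            int width = (int)(duty * OVERSAMP + 0.5);

            for (t = 0; t < OVERSAMP; t++)
                y[k * OVERSAMP + t] = (t < width) ? 1.0 : 0.0;
        }
        /* y[] would then be FFT'd to get a spectrum like the one above */
        return 0;
    }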
What interests me is the way the amplitude of the even-order tones seems to increase with frequency. I thought the amplitude of an IMD product could only be proportional to a power of the input amplitude. I don't understand how it could also be proportional to frequency.
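To spell out the assumption: for a memoryless nonlinearity y = a1*x + a2*x^2 + a3*x^3 + ..., an input x(t) = A*cos(w*t) produces (for example) a 2nd harmonic of amplitude (a2/2)*A^2 from the squared term: a power of A, with no w in it anywhere. The only elementary operation I can think of whose output amplitude scales linearly with frequency is differentiation, since d/dt[A*cos(w*t)] = -A*w*sin(w*t), and I don't see where a differentiator would come from here.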
When I run my simulation using the real MASH sequence, I get the PFD output spectrum I would expect, but with a noise signal rising at 20 dB per decade superimposed on top of it, i.e. distortion whose amplitude is proportional to frequency.
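(By "MASH sequence" I mean the output of a cascaded-accumulator modulator along these lines. The sketch below is a generic MASH 1-1-1 with a 24-bit accumulator; my actual order and width differ in detail, but that shouldn't matter to the question.)

    /* Sketch of a MASH 1-1-1 divide-control generator: three cascaded
       accumulators whose carries are recombined through (1 - z^-1)
       differentiators.  Output s[k] is the offset added to the nominal
       divide ratio N each reference cycle. */
    #include <stdio.h>

    #define ACCBITS 24                        /* accumulator width (placeholder) */
    #define MASK    ((1UL << ACCBITS) - 1)

    int main(void)
    {
        unsigned long a1 = 0, a2 = 0, a3 = 0;
        unsigned long frac = (1UL << ACCBITS) / 2;  /* fractional part = .5 */
        int c2_d = 0, c3_d = 0, c3_dd = 0;          /* delayed carries */
        int k;

        for (k = 0; k < 32; k++) {
            int c1, c2, c3, s;

            a1 += frac; c1 = (a1 > MASK); a1 &= MASK;
            a2 += a1;   c2 = (a2 > MASK); a2 &= MASK;
            a3 += a2;   c3 = (a3 > MASK); a3 &= MASK;

            /* s = C1 + (1 - z^-1)*C2 + (1 - z^-1)^2*C3 */
            s = c1 + (c2 - c2_d) + (c3 - 2 * c3_d + c3_dd);
            c3_dd = c3_d; c3_d = c3; c2_d = c2;

            printf("%d ", s);   /* offsets span -3..+4 for a MASH 1-1-1 */
        }
        printf("\n");
        return 0;
    }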
I don't know for sure whether these tones are PWM distortion; they might be due to a loss of floating-point precision in my simulation, although I'm using the lcc-win32 C compiler precisely because its "long double" type supports the full 80-bit precision of the Intel FPU.
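This is the kind of sanity check I have in mind for the precision question (again only a sketch; the increment and iteration count are placeholders, not my real phase-error sequence):

    /* Accumulate the same sequence in double and long double and compare.
       If the divergence sits far below the observed noise floor, rounding
       error probably isn't the explanation. */
    #include <stdio.h>

    int main(void)
    {
        double acc_d = 0.0;
        long double acc_ld = 0.0L;
        long k;

        for (k = 0; k < 10000000L; k++) {
            /* placeholder increment (harmonic, just so the sums round) */
            double inc = 1.0 / (double)(k + 1);
            acc_d  += inc;
            acc_ld += (long double)inc;
        }
        printf("double      : %.17g\n", acc_d);
        printf("long double : %.21Lg\n", acc_ld);
        printf("difference  : %Lg\n", acc_ld - (long double)acc_d);
        return 0;
    }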
Any pointers would, as ever, be appreciated ...