An alternative to a conventional VCO-based PLL, given an FPGA, would be to implement most of the synthesizer as a DDS. The FPGA would contain a phase accumulator and a BRAM-based sine LUT, and would output a sine wave to a cheap 8-bit DAC. The DAC output would be reconstructed with a simple lowpass filter (how simple depends on the oversampling ratio) and then squared up with a comparator to produce a clock of any desired frequency. The comparator does the job of placing the clock edge at the correct interpolated point between DAC samples.

This has several advantages. You can synthesize a clock with any crazy multiplication ratio without fractional-N techniques. You can dither or spread the clock easily, and digitally control overshoot when transitioning between frequencies. You get rid of a bunch of analog hardware, including the VCO, loop filter, and charge pump, all of which have characteristics that vary from part to part. You could even generate really fast clocks by bandpass-filtering one of the DAC's images.

Yet I rarely see this technique used. Is it just used more often than I think, or does it have some disadvantage, like introducing a lot of phase noise?
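To make the idea concrete, here is a minimal sketch of the digital part (phase accumulator plus sine LUT) in Python. All parameters are assumptions chosen for illustration: a 32-bit accumulator, a 256-entry 8-bit LUT, and a 100 MHz accumulator clock.

```python
import math

# Hypothetical parameters, for illustration only.
ACC_BITS = 32         # phase accumulator width
LUT_BITS = 8          # LUT address width -> 256 entries
F_CLK = 100_000_000   # accumulator clock, Hz

# 8-bit unsigned sine table, as it might live in BRAM.
LUT = [round(127.5 + 127.5 * math.sin(2 * math.pi * i / 2**LUT_BITS))
       for i in range(2**LUT_BITS)]

def tuning_word(f_out):
    """Frequency tuning word for a desired output frequency."""
    return round(f_out * 2**ACC_BITS / F_CLK)

def dds_samples(ftw, n):
    """Generate n DAC codes: wrap the accumulator each clock and
    index the LUT with the accumulator's most significant bits."""
    acc = 0
    out = []
    for _ in range(n):
        acc = (acc + ftw) & (2**ACC_BITS - 1)
        out.append(LUT[acc >> (ACC_BITS - LUT_BITS)])
    return out

ftw = tuning_word(3_141_592)        # an arbitrary "crazy" ratio
actual = ftw * F_CLK / 2**ACC_BITS  # realized output frequency
```

The frequency resolution is F_CLK / 2**ACC_BITS, about 0.023 Hz with these numbers, which is what lets the scheme hit essentially any ratio without fractional-N hardware. The deterministic phase truncation (keeping only the top LUT_BITS of the accumulator) is also one source of the spurs that a real design would have to budget for.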