I used to believe that ChipScope did not have any impact on the design. I used to.
I have two FPGAs communicating on the PCB (LVDS). One of them (called source here) is doing some signal processing and sends the result to the destination FPGA. If I probe (with ChipScope) some internal results of the processing (FFT output) in the source, the data looks fine at the destination. However, if I remove the probe I get some interesting, but rather annoying, bit errors in the data received (and probed) at the destination.
Before probing at the source I thought this was a signal-integrity issue on the PCB, but the probing proved me wrong on that point. The source is a V2000 device, and without the probe about 90% of the BRAMs and multipliers are utilized. I would have thought that adding a ChipScope core would make it harder for PAR to meet timing constraints, and consequently deliver a design more prone to bit errors...
Maybe delivering the design with ChipScope still in it is my only choice, but it doesn't feel very good.
Has anyone had similar experiences?