I want to program an FPGA to generate two-dimensional,
10-bit video white noise. Any suggestions? I'm not sure if an independent LFSR per bit would work. If so, would I want different lengths per bit, or would I just initialize the different bits at different times?
If they're all running on the same clock, you are likely to see an obvious pattern in the bits, especially if the LFSRs are the same length: two same-length LFSRs with identical taps produce the same sequence, merely shifted in time, so the bit planes end up strongly correlated.
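A quick simulation shows why: with the same taps, every nonzero seed just picks a different starting point in one shared cycle. A Python sketch (8-bit Fibonacci LFSR, taps 8/6/5/4, a standard maximal-length choice from the usual tap tables):

```python
def lfsr8(state):
    # one step of an 8-bit Fibonacci LFSR, taps 8,6,5,4 (maximal length):
    # feedback bit is the XOR of the tapped bits, shifted in at the bottom
    bit = ((state >> 7) ^ (state >> 5) ^ (state >> 4) ^ (state >> 3)) & 1
    return ((state << 1) | bit) & 0xFF

def orbit(seed, n=255):
    # first n states visited starting from `seed`
    seq, s = [], seed
    for _ in range(n):
        seq.append(s)
        s = lfsr8(s)
    return seq

a = orbit(0x01)            # full 255-state cycle starting from one seed
b = orbit(0x42)            # same LFSR, different seed
assert len(set(a)) == 255  # maximal length: all nonzero states visited
k = a.index(0x42)          # where the second seed sits in the first cycle
assert b == a[k:] + a[:k]  # b is just a time-shifted copy of a
```

Because the second register's output is a pure time shift of the first, using one such LFSR per bit of a pixel gives diagonal streaks rather than noise.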
If you are using an FPGA that has hardware multipliers (as most do nowadays), I'd suggest using a linear congruential generator:
What you want is a modulus that is a power of two, so the modulo operation is free: you just keep the low bits of the product. The first set of coefficients listed in the "Example LCGs" section uses a modulus of 2^32. That gives you a 32-bit pseudo-random number per multiply, so you could get three 10-bit video samples per iteration. Take them from the upper 30 bits, though: the low-order bits of a power-of-two-modulus LCG have short periods.
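Assuming the coefficients meant are the Numerical Recipes ones (a = 1664525, c = 1013904223, m = 2^32 — an assumption on my part, since the table isn't reproduced here), the generator and the sample extraction can be modeled in a few lines of Python:

```python
# Assumed LCG constants (Numerical Recipes); swap in whichever row of the
# "Example LCGs" table you actually use.
A, C, M = 1664525, 1013904223, 1 << 32

def lcg_next(state):
    # modulus 2^32 means "mod" is just keeping the low 32 bits -- free in hardware
    return (A * state + C) % M

def three_samples(state):
    # carve three 10-bit video samples out of the TOP 30 bits; the low bits
    # of a power-of-two-modulus LCG are weak, so bits 1:0 are discarded
    return [(state >> 22) & 0x3FF,
            (state >> 12) & 0x3FF,
            (state >>  2) & 0x3FF]
```

In hardware this is one multiply-accumulate and some wire selection per iteration; the state register is 32 bits.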
In typical Xilinx FPGAs, you get 18-bit signed multipliers, so you would build a 32-bit unsigned multiplier from four of those and some adders. Maybe the tools can infer that from a Verilog or VHDL multiplication operator, or maybe you can use Coregen.
If your data rate requirement isn't too high, you can implement the 32-bit multiplier using a single hardware multiplier cycled four times.
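The four-partial-product decomposition is easy to sketch in software. This Python model assumes a 16-bit split (16-bit unsigned halves fit comfortably in 18-bit signed DSP inputs); the four products are exactly what four DSP blocks, or one block cycled four times, would compute:

```python
def mul32_from_16x16(a, b):
    # split each 32-bit operand into 16-bit halves
    al, ah = a & 0xFFFF, a >> 16
    bl, bh = b & 0xFFFF, b >> 16
    # four 16x16 partial products, each fitting one 18x18 hardware multiplier
    p0 = al * bl   # weight 2^0
    p1 = al * bh   # weight 2^16
    p2 = ah * bl   # weight 2^16
    p3 = ah * bh   # weight 2^32
    # combine with shifted adds
    return p0 + ((p1 + p2) << 16) + (p3 << 32)
```

One pleasant consequence for the LCG: since only the result mod 2^32 is needed, the `p3` term never reaches the output, so the high-half-times-high-half multiply can be dropped entirely and three multiplies suffice.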
It is possible that an LCG will still show too much pattern for your application. If so, it is possible to implement the Mersenne Twister in hardware. While it is not suitable as a source of pseudorandom sequences for cryptography, it has a long enough period (2^19937 − 1, on the order of 10^6001) that it should be suitable as a general noise source.
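For reference, the MT19937 recurrence is compact enough to write out. This is a straight Python transcription of the standard algorithm (a behavioral model to design against, not a hardware architecture):

```python
class MT19937:
    N, M = 624, 397
    MATRIX_A = 0x9908B0DF
    UPPER, LOWER = 0x80000000, 0x7FFFFFFF

    def __init__(self, seed=5489):
        # standard init_genrand seeding
        self.mt = [0] * self.N
        self.mt[0] = seed & 0xFFFFFFFF
        for i in range(1, self.N):
            self.mt[i] = (1812433253 * (self.mt[i - 1] ^ (self.mt[i - 1] >> 30)) + i) & 0xFFFFFFFF
        self.index = self.N

    def _regenerate(self):
        # refill the 624-word state; in hardware this is a simple
        # shift/XOR update over a block RAM holding the state
        for i in range(self.N):
            y = (self.mt[i] & self.UPPER) | (self.mt[(i + 1) % self.N] & self.LOWER)
            self.mt[i] = self.mt[(i + self.M) % self.N] ^ (y >> 1)
            if y & 1:
                self.mt[i] ^= self.MATRIX_A
        self.index = 0

    def next32(self):
        if self.index >= self.N:
            self._regenerate()
        y = self.mt[self.index]
        self.index += 1
        # tempering: pure shifts, ANDs, XORs -- cheap combinational logic
        y ^= y >> 11
        y ^= (y << 7) & 0x9D2C5680
        y ^= (y << 15) & 0xEFC60000
        return (y ^ (y >> 18)) & 0xFFFFFFFF
```

Everything here is shifts, XORs, and table reads, which is why it maps onto an FPGA reasonably well; each 32-bit output again yields three 10-bit samples.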
You can use an LFSR, but use one that skips ten steps forward in the sequence on each clock, so that the ten bits you take are not correlated from cycle to cycle. If you used, for example, a 17-bit maximal-length LFSR, you would still cycle through all 2^17−1 states, since 2^17−1 = 131071 is coprime to 10 (it ends in 1, so it is divisible by neither 2 nor 5). I forget what this kind of LFSR is called, but they are easy to make: you can usually just write a double loop in HDL, and a good synthesizer like Synplify will take care of the rest. -Kevin
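A software model of the idea (17-bit Fibonacci LFSR with taps 17 and 14, a maximal-length pair from the usual tap tables; the ten-fold unrolling below is what the HDL double loop would synthesize into):

```python
def lfsr17_step(state):
    # one step of a 17-bit Fibonacci LFSR, taps 17,14 (maximal length)
    bit = ((state >> 16) ^ (state >> 13)) & 1
    return ((state << 1) | bit) & 0x1FFFF

def leap10(state):
    # advance 10 steps per clock; a synthesizer flattens this loop into
    # one layer of combinational XORs between state registers
    for _ in range(10):
        state = lfsr17_step(state)
    return state

def next_sample(state):
    # one 10-bit noise sample per clock from the leaped state
    state = leap10(state)
    return state, state & 0x3FF
```

Because 131071 and 10 are coprime, iterating `leap10` from any nonzero seed still walks the full 2^17−1 cycle before repeating.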