How do I specify the use of the F or G LUT in a slice? Can I? Does it matter? Why? How do you reduce wasted resources if you can't specify this in an RPM?
The following code:
//synthesis attribute RLOC of BIT[0].SRL is X0Y0
//synthesis attribute RLOC of BIT[1].SRL is X0Y0
//synthesis attribute RLOC of BIT[2].SRL is X0Y1
//synthesis attribute RLOC of BIT[3].SRL is X0Y1
...etc.
places SRLC16s from a Verilog "generate" block, but I can't seem to control which of the two LUTs in a slice is used for each pair of bits.
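For context, the generate loop looks roughly like this. This is only a sketch: the module name, parameter names, and port names are my guesses, not the actual code; only the BIT[i].SRL instance labeling matches the attributes above.

```verilog
// Hypothetical sketch of the SRLC16 delay block described below.
// Module/parameter/port names are assumptions; BIT[i].SRL matches
// the RLOC attributes quoted above.
module srl_delay #(
    parameter WIDTH = 8,   // word size, fixed at instantiation
    parameter DELAY = 16   // delay in clocks (1 to 16 per SRLC16)
) (
    input                  clk,
    input      [WIDTH-1:0] din,
    output     [WIDTH-1:0] dout
);
    localparam [3:0] ADDR = DELAY - 1;  // SRL16 address = delay - 1

    genvar i;
    generate
        for (i = 0; i < WIDTH; i = i + 1) begin : BIT
            // The RLOC attributes pair two bits per slice:
            // row Y0 gets bits 0 and 1, row Y1 gets bits 2 and 3, etc.
            SRLC16 SRL (
                .Q   (dout[i]),
                .A0  (ADDR[0]),
                .A1  (ADDR[1]),
                .A2  (ADDR[2]),
                .A3  (ADDR[3]),
                .CLK (clk),
                .D   (din[i])
            );
        end
    endgenerate
endmodule
```

The RLOC pins each pair of instances to a slice, but nothing in the attribute says which of the pair should land in the F LUT and which in the G LUT, which is exactly the problem.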
The module in question generates an SRLC16-based delay block whose word size and delay are set at instantiation. It's a building block for a FIR filter, and I'd like to control placement as tightly as possible. Instead of a clean upward or downward sequential distribution, the attributes above produce a sequence like this (assuming an 8-bit word):
BIT[6].SRL (top-most slice SRL)
BIT[7].SRL
BIT[4].SRL
BIT[5].SRL
BIT[2].SRL
BIT[3].SRL
BIT[0].SRL
BIT[1].SRL (bottom-most slice SRL)
What I really want is:
BIT[7].SRL (top-most slice SRL)
BIT[6].SRL
BIT[5].SRL
BIT[4].SRL
BIT[3].SRL
BIT[2].SRL
BIT[1].SRL
BIT[0].SRL (bottom-most slice SRL)
Thanks,