I have an application with a Xilinx Spartan-3 FPGA where I would like to use a single FPGA binary to support two I/O voltage levels: 2.8 V and 1.8 V. My questions are as follows:

Why does the UCF file include the selected I/O standard for each pin? I understand that the drive strength and slew rate may change based on the I/O standard, but are there any other functional hardware changes made based on the selected standard? What are the potential consequences of telling the compiler that I am using 2.8 V but then running the application at 1.8 V?
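For reference, this is the kind of per-pin constraint I mean (the net names and LOC values below are made up for illustration; IOSTANDARD values are from the Spartan-3 LVCMOS family):

```
# Hypothetical UCF pin constraints -- net names and pin locations are illustrative only
NET "data_out<0>" LOC = "P12" | IOSTANDARD = LVCMOS18 | DRIVE = 8  | SLEW = SLOW ;
NET "data_out<1>" LOC = "P13" | IOSTANDARD = LVCMOS25 | DRIVE = 12 | SLEW = FAST ;
```

I am asking specifically about the effect of the IOSTANDARD attribute when the actual VCCO applied to the bank does not match the standard declared here.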
Thanks