I am designing a CMOS schematic that uses both 3.3 V and 1.2 V supplies (0.13 um process). When data or clock signals cross from the 3.3 V domain to the 1.2 V domain, I need to scale the signal swing down from a maximum of 3.3 V to 1.2 V.
Since 1.2 V is not too low, I decided to use a buffer/inverter powered from the 1.2 V supply (VCC or VDD). With a 3.3 V / 0 V CMOS signal at the buffer's input, the output swing is scaled down to 1.2 V / 0 V. Apart from a small duty-cycle error, the circuit seems to be working. But I've seen other circuits that do the same job (using an NMOS pass transistor instead of an inverter), and no paper or book properly explains why those circuits are preferred over a simple inverter.
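To make the setup concrete, here is the simple arithmetic for the gate-source voltages that the two 1.2 V inverter devices see in each input state (just illustrative numbers, not a simulation; the variable names are my own):

```python
# Terminal voltages of a 1.2 V-supplied inverter driven directly
# by a full-swing 3.3 V signal. Illustrative arithmetic only.

VDD_IO = 3.3    # V, supply of the 3.3 V domain (input swing)
VDD_CORE = 1.2  # V, supply of the 1.2 V inverter

for vin in (0.0, VDD_IO):
    vgs_nmos = vin - 0.0         # NMOS source tied to ground
    vsg_pmos = vin - VDD_CORE    # PMOS source tied to the 1.2 V rail
    print(f"Vin={vin:.1f} V: NMOS Vgs={vgs_nmos:.1f} V, "
          f"PMOS gate sits {vsg_pmos:.1f} V above its source")
```

So when the input is high, the gates of the 1.2 V devices are driven well above the 1.2 V rail, which is part of what makes me wonder whether the simple inverter is safe here.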
My question is: for anyone who has worked on mixed-supply circuits before, is there a potential problem here that I am overlooking?