Let's say you're measuring the voltage across a sense resistor on a DC line whose potential with respect to ground fluctuates between 12 and 15 V. The amp has a differential-mode gain of 10, so when the sense-resistor voltage is 100 mV, the amp output is 1 V. So far so good.

But I get confused by the common-mode rejection ratio (CMRR). Is it defined with respect to the absolute common-mode fluctuation, or with respect to the proportional variation? Say the CMRR is 40 dB, i.e. a factor of 1/100.
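(For reference, here's the dB-to-ratio conversion I'm using, with $A_\mathrm{d}$ the differential-mode gain and $A_\mathrm{cm}$ the common-mode gain:)

$$\mathrm{CMRR} = 20 \log_{10}\!\left(\frac{A_\mathrm{d}}{A_\mathrm{cm}}\right)\,\mathrm{dB}, \qquad 40\ \mathrm{dB} \;\Rightarrow\; \frac{A_\mathrm{cm}}{A_\mathrm{d}} = \frac{1}{100}.$$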
Does that mean:

1. the absolute error in the amp output equals 1/100 of the absolute 3 V common-mode fluctuation, i.e. 30 mV; or
2. the proportionate error in the amp output is 1/100 of the proportionate common-mode fluctuation, i.e. 1/100 of 25%, which works out to 2.5 mV when the amp is putting out 1 V?
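To put numbers on both readings, here's a quick Python sketch of the arithmetic behind each interpretation (the variable names are just mine for illustration; the code doesn't assume which reading is correct):

```python
# Arithmetic behind the two candidate interpretations of a 40 dB CMRR.
# All values come from the example above.

A_d = 10                      # differential-mode gain
cmrr_db = 40                  # common-mode rejection ratio in dB
cmrr = 10 ** (cmrr_db / 20)   # 40 dB -> factor of 100

v_cm_swing = 15.0 - 12.0      # absolute common-mode fluctuation: 3 V
v_diff = 0.100                # sense-resistor voltage: 100 mV
v_out = A_d * v_diff          # nominal amp output: 1 V

# Interpretation 1: absolute output error = absolute CM swing / CMRR
err_1 = v_cm_swing / cmrr                 # 3 V / 100 = 30 mV

# Interpretation 2: proportionate output error = proportionate CM swing / CMRR
cm_fraction = v_cm_swing / 12.0           # 3 V / 12 V = 25 %
err_2 = (cm_fraction / cmrr) * v_out      # 0.25 % of 1 V = 2.5 mV

print(f"Interpretation 1: {err_1 * 1e3:.1f} mV")  # 30.0 mV
print(f"Interpretation 2: {err_2 * 1e3:.1f} mV")  # 2.5 mV
```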