I have (or will have) a 1000 bar pressure sensor; it's a full bridge with a nominal resistance of 3500 ohms ±20%. I've no way of pressurising it accurately.
But it's supplied with calibration data comprising sensitivity and a list of offsets against temperature, and I can measure the temperature. So far, so correctable.
However, the data supplied is in the form of mV offset against temperature, along with a single mV/bar sensitivity figure, *when the bridge is driven from a constant 1mA*. I don't want to drive it that way: I have a voltage reference and can use the differential ADC ratiometrically. The sensor would then be excited at 2.500V, drawing about 700µA (2.5V / 3500Ω ≈ 714µA), which is within the allowable range.
I feel there must be some way of using the supplied calibration offsets, but I'm not sure exactly how. (Also, doesn't everybody do bridges ratiometrically?)
[Constant current is tricky with only a 3.3V supply; the data sheet says I could go as low as 0.5mA, but I don't really want the extra complexity.]

Any ideas as to how I'd do the math(s)?
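To show what I'm imagining, here's a sketch of the conversion I *think* should work, assuming both the offsets and the sensitivity scale linearly with the actual bridge current, so I'd measure the real bridge resistance (the spec's ±20% is too loose to trust) and derive a single scale factor from it. All the calibration numbers and the resistance below are made-up placeholders, not real sensor data:

```python
# Sketch: using 1 mA constant-current calibration data with 2.5 V
# (ratiometric) voltage excitation.  Assumption: bridge output scales
# linearly with excitation current, so one scale factor k converts the
# calibration's mV figures to my setup.

I_CAL = 1.0e-3      # calibration drive current: 1 mA, per the cal sheet
V_EXC = 2.500       # my voltage excitation (also the ADC reference)
R_BRIDGE = 3483.0   # measured bridge resistance (placeholder; spec 3500 +/-20%)

# Placeholder calibration table: mV offset vs temperature (degC) at 1 mA.
CAL_OFFSET_MV = {0.0: -0.12, 25.0: 0.05, 50.0: 0.31}
SENS_MV_PER_BAR = 0.070   # placeholder sensitivity at 1 mA

def interp_offset_mv(temp_c):
    """Linearly interpolate the 1 mA offset table at temp_c (clamped)."""
    temps = sorted(CAL_OFFSET_MV)
    if temp_c <= temps[0]:
        return CAL_OFFSET_MV[temps[0]]
    if temp_c >= temps[-1]:
        return CAL_OFFSET_MV[temps[-1]]
    for lo, hi in zip(temps, temps[1:]):
        if lo <= temp_c <= hi:
            f = (temp_c - lo) / (hi - lo)
            return CAL_OFFSET_MV[lo] + f * (CAL_OFFSET_MV[hi] - CAL_OFFSET_MV[lo])

def pressure_bar(v_out_mv, temp_c):
    """Bridge output (mV, at V_EXC excitation) -> pressure (bar)."""
    k = (V_EXC / R_BRIDGE) / I_CAL            # actual current as fraction of 1 mA
    offset_mv = k * interp_offset_mv(temp_c)  # offsets scale with current...
    sens_mv_per_bar = k * SENS_MV_PER_BAR     # ...and so does sensitivity
    return (v_out_mv - offset_mv) / sens_mv_per_bar
```

If I've got that right, a nice bonus falls out: since the ADC reading is ratiometric to V_EXC, the reading is (S·P + offset(T)) / (R_bridge · 1mA) and the reference voltage cancels completely, leaving only the measured bridge resistance as the unknown. Does that scaling argument hold up?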