I'm trying to determine the power rating I need for a current sense resistor and am having trouble calculating the effect on the resistance due to the resistor's temperature coefficient. I want to minimize the temperature's effect on the resistance as much as possible (without having to go to a huge resistor, that is) to decrease any drift in my current reading as the resistor heats up.

If a resistor has a TCR of ±20 ppm/°C, is the effect calculated as follows?

- Assume a worst-case 100 °C temperature rise.

- 1 ppm of a 0.010-ohm resistor = 0.010/1,000,000 = 0.01 µΩ (10 nΩ)

- 20 ppm would then be 0.2 µΩ per °C

- (100 °C rise) × (0.2 µΩ/°C) = 0.00002 Ω (20 µΩ) rise in resistance

- Resistance at 125 °C (assuming 25 °C ambient) = 0.010 + 0.00002 = 0.01002 Ω

At 10 A, that would only change the sense resistor's voltage from an "ideal" 0.1000 V to 0.1002 V, a change of 0.2 mV (0.2%).
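As a sanity check on the arithmetic, here is a minimal sketch of the same calculation, using the values from the question (0.010 Ω, 20 ppm/°C worst case, 100 °C rise, 10 A); the variable names are my own:

```python
R_nominal = 0.010   # ohms, nominal sense-resistor value
TCR_ppm = 20        # ppm/°C, worst-case temperature coefficient
delta_T = 100       # °C rise (25 °C ambient -> 125 °C)
I = 10.0            # amps through the sense resistor

# Fractional change = TCR (ppm) * 1e-6 per °C, times the temperature rise
delta_R = R_nominal * TCR_ppm * 1e-6 * delta_T
R_hot = R_nominal + delta_R

V_ideal = I * R_nominal
V_hot = I * R_hot

print(f"dR = {delta_R:.2e} ohm, R_hot = {R_hot:.5f} ohm")
print(f"V: {V_ideal:.4f} V -> {V_hot:.4f} V "
      f"(change {(V_hot - V_ideal) * 1e3:.2f} mV)")
```

This prints a resistance rise of 2e-5 Ω (0.01002 Ω hot) and a 0.20 mV shift in the sense voltage, matching the hand calculation above.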

Is this the proper way to calculate it?

Thanks for any help you can give me! John