Okay, this is a homework problem, but I'm middle-aged, already have my B.A., and am just returning to school to pick up a little more knowledge. Besides, you can clearly see (below) that I'm giving this my best shot. So, someone please help me out here. For a basic electronics class, we are given that the temperature coefficient for the resistance of a material is given by:

a (for alpha) = (1/R)(dR/dT), and we are asked to show that:

R1 = {[ 1 + a(T1 - Ts) ] / [1 + a(T2 - Ts)]} R2

where Ts is the "reference temperature".

But here is how the math works out for me:

a = (1/R)(dR/dT)

dR/R = a dT Take indefinite integral of both sides....

ln R = a T + Ts, where Ts is said reference temperature

Assume R1 corresponds to T1, and R2 to T2, then....

ln R1 = a T1 + Ts, and ln R2 = a T2 + Ts

ln R1 - ln R2 = a (T1 - T2), so Ts is already gone.

ln (R1/R2) = a (T1 - T2), exponentiate both sides...

R1/R2 = exp (a [T1 - T2])

exp x is approximately 1 + x, so we have,

R1 = R2 (1 + a [T1 - T2]), which is not the professor's desired result.

Did the professor screw up, or have I forgotten some basic calculus or algebra in the twenty years since college? Thanks in advance for all replies....
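For what it's worth, here's a quick numerical check I ran. All the values (a, Ts, T1, T2, R2) are made up just for illustration; it compares my exact solution of dR/dT = aR, the professor's formula, and my first-order approximation:

```python
import math

# Made-up illustrative values (not from the assignment)
a = 0.004      # temperature coefficient, 1/degC (roughly copper-like)
Ts = 20.0      # reference temperature, degC
T1, T2 = 30.0, 80.0
R2 = 100.0     # assumed resistance at T2, ohms

# Exact solution of dR/dT = a*R:  R1/R2 = exp(a*(T1 - T2))
R1_exact = R2 * math.exp(a * (T1 - T2))

# Professor's formula:  R1 = [1 + a*(T1 - Ts)] / [1 + a*(T2 - Ts)] * R2
R1_prof = (1 + a * (T1 - Ts)) / (1 + a * (T2 - Ts)) * R2

# My first-order approximation:  R1 ~ R2 * (1 + a*(T1 - T2))
R1_approx = R2 * (1 + a * (T1 - T2))

print(R1_exact, R1_prof, R1_approx)
```

The three numbers come out close but not identical, and they get closer as a*(T2 - T1) shrinks, which makes me suspect the formulas only agree to first order in a.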

Steve O.
