So,
A friend and I are having a polite debate.
He's calculating the current in a circuit by measuring the voltage drop across a 20 Ω resistor. (The current is in the ~30 mA range, for reference.)
I think the best approach is to put a mA meter in series with the circuit and read it directly. He disagrees, saying his method of measuring the voltage drop across the resistor is more accurate.
His claim is that the mA meter adds resistance to the circuit and under-reports the result. I agree with his statement in theory, but disagree on the magnitude of the error.
I claim the error in measuring ~0.6 VDC, plus the uncertainty in his "known" resistance, is the larger concern.
For the record, we're just simple tinkerers, using simple (but not junk) tools, working remotely from each other. No one's life or livelihood depends on the results.
If a little background will help, here's the circuit.
5 VDC -> LED -> resistor -> LED -> pin on shift register -> GND

LEDs drop ~2 V each (rated at a max of 30 mA)
Resistor is the 1/4 watt carbon film, 20 Ω ±5% variety
Shift register on-resistance is rated at 6.5 Ω at 50 mA at 25 °C
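To make the two error sources comparable, here's a rough back-of-the-envelope sketch of the error budget. The meter burden resistance (5 Ω) and the voltage-reading accuracy (1%) are assumptions I picked as typical for hobby-grade DMMs, not values from either of us; it also treats the LED drops as fixed at their nominal 2 V, which overstates the burden effect since a real LED's drop falls slightly as current drops.

```python
# Rough error-budget sketch for the debate above.
# ASSUMPTIONS (not from the post): a typical DMM mA range adds ~5 ohm
# burden resistance, and a hobby meter reads DC volts to about 1%.

V_SUPPLY = 5.0      # supply voltage
V_LED = 2.0         # nominal drop per LED (two LEDs in the chain)
R_NOMINAL = 20.0    # resistor value, ohms
R_TOL = 0.05        # +/-5% carbon film tolerance
R_SHIFT = 6.5       # shift-register on-resistance, ohms
R_BURDEN = 5.0      # assumed mA-meter burden resistance, ohms (hypothetical)
V_READ_ERR = 0.01   # assumed 1% voltage-reading accuracy (hypothetical)

# Current without the meter, treating the LED drops as fixed:
v_rest = V_SUPPLY - 2 * V_LED                # volts left across resistive parts
i_true = v_rest / (R_NOMINAL + R_SHIFT)      # amps

# Voltage-drop method: worst-case error is resistor tolerance + reading error.
err_vdrop = R_TOL + V_READ_ERR

# Series-ammeter method: burden resistance lowers the current being measured.
i_with_meter = v_rest / (R_NOMINAL + R_SHIFT + R_BURDEN)
err_meter = (i_true - i_with_meter) / i_true  # fractional under-report

print(f"current (no meter):   {i_true * 1000:.1f} mA")
print(f"voltage-drop error:   ~{err_vdrop * 100:.0f}% (tolerance + reading)")
print(f"ammeter burden error: ~{err_meter * 100:.0f}% (systematic)")
```

Under these assumptions the burden error isn't negligible in a circuit this low-impedance, but it's systematic: if you know the meter's burden resistance you can correct for it, whereas the resistor's ±5% tolerance is an unknown unless you measure the resistor itself.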
I agree with him that he should use a 33 Ω resistor for safety's sake, but I'd still like the debate over using a mA meter resolved :)
So, what's the collective opinion on this?