Dear Newsgroup,
I came across an interesting phenomenon:
In the lab we placed a "100M" Ohm resistor (whose resistance we couldn't measure directly with the multimeter) in series with a 10 MOhm resistor (which we could measure). Building a basic voltage divider with a 3 V battery, one should get roughly 0.3 V across the 10M resistor (3 V x 10/110 = 0.27 V).
Now the internal resistance of both multimeters we have is 10M (Fluke 175, Amprobe 18-A), so connecting a meter across the 10M resistor places two 10M resistances in parallel. Doing the basics: the equivalent combination is then 5M (sorry if that was insulting :), which would give a measured drop of about 0.15 V.

Instead, on the Amprobe 18-A I read the 0.3 V, while on the Fluke I get 0.15 V. So apparently the Amprobe somehow realizes it is changing the voltage and corrects for it. Has anyone ever come across this before?
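In case anyone wants to check my arithmetic, here is a quick sketch of the loading calculation (assuming the nominal values from above: 3 V source, 100M series resistor, 10M divider resistor, 10M meter input resistance):

```python
def divider(v, r_top, r_bottom):
    # Voltage across the bottom resistor of a two-resistor divider.
    return v * r_bottom / (r_top + r_bottom)

V = 3.0          # battery
R_TOP = 100e6    # the "100M" resistor
R_BOT = 10e6     # the 10M resistor
R_METER = 10e6   # input resistance of both meters (Fluke 175, Amprobe 18-A)

# Unloaded divider: what an ideal (infinite-impedance) meter would see.
unloaded = divider(V, R_TOP, R_BOT)                  # about 0.27 V

# With the meter attached: its 10M sits in parallel with the 10M resistor.
r_parallel = R_BOT * R_METER / (R_BOT + R_METER)     # 5M
loaded = divider(V, R_TOP, r_parallel)               # about 0.14 V

print(round(unloaded, 2), round(loaded, 2))
```

So an ordinary 10M-input meter should indeed read the lower value, which is what the Fluke shows.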
Any ideas on how the meter does this?
LabMonkey