I only have a cheap Radio Shack 3.75-digit auto-ranging multimeter (22-803), and I want to calibrate the DC voltage reading. The ranges break at 400mV,
4V, 40V, 400V and 1000V. On opening the case I find two pots labeled VR1 and VR2, and I would bet money that VR1 is the one to adjust. I should be able to confirm that by connecting a 1-meg resistor from the wiper to one end of VR1 and seeing whether that changes the reading.

Anyway, I don't understand how auto-ranging meters work. In particular, if I find a really accurate 3.6V source and calibrate to that, does that say anything about how accurately a 12V source would be read, which would be in a different range? Would it be better to calibrate to the 12V source, and would that automatically make the 3.6V source read correctly, or are the two ranges completely independent? Well, I know they aren't *completely* independent, since I assume VR1 is most likely *the* V-ref for all purposes for the IC on the other side of the PC board. But I would guess that each range has its own resistors, which would make that range subject to the accuracy of those resistors, and there certainly aren't pots for each range in this meter. So it may be that you can only make the meter accurate within one range. But as I say, I don't really know how auto-ranging meters work, so the stuff about resistors may be all wrong.
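To make my own guess concrete, here's a toy model of what I imagine the front end looks like (this is my assumption, not the 22-803 schematic): one ADC whose full scale is set by the Vref that VR1 trims, fed through a resistive input divider of 1, 1/10, 1/100, 1/1000. In that model a Vref error scales all ranges together, while each divider's resistor tolerance adds an independent per-range error:

```python
# Toy model of an auto-ranging DVM front end (my assumptions, not the
# real 22-803 circuit): a single ADC with 400 mV full scale, preceded
# by a selectable resistive divider. The meter auto-ranges by picking
# the first divider tap that brings the input under full scale.

FULL_SCALE = 0.4                       # ADC full scale, volts (400 mV)
DIVIDERS = [1.0, 0.1, 0.01, 0.001]     # 400mV, 4V, 40V, 400V ranges

def reading(vin, vref_error=0.0, divider_errors=None):
    """What the meter displays for a true DC input vin."""
    divider_errors = divider_errors or [0.0] * len(DIVIDERS)
    for div, derr in zip(DIVIDERS, divider_errors):
        v_adc = vin * div * (1 + derr)     # after the (imperfect) divider
        if v_adc <= FULL_SCALE:            # first range that fits wins
            # The meter scales back up by the NOMINAL divider ratio,
            # so a divider error shows up directly in the reading.
            # A Vref that sits high makes every displayed value low.
            return v_adc / (1 + vref_error) / div
    return float("inf")                    # overrange

# Calibrating VR1 on an accurate 3.6 V source nulls the Vref error on
# every range, but it cannot touch a per-range divider error: here the
# 40 V range divider is off by 0.1%, so 12 V still reads wrong even
# though 3.6 V (on the 4 V range) reads dead on.
print(reading(3.6,  divider_errors=[0, 0, 0.001, 0]))
print(reading(12.0, divider_errors=[0, 0, 0.001, 0]))
```

If the real meter is anything like this, calibrating at 3.6V fixes the shared Vref term for all ranges, but the residual error on the 40V range is whatever its own resistors contribute.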
If anyone would like to explain how these auto-rangers work, and what my calibration options are, I would appreciate it.
I have a Digikey order coming up, and see that I could add a TI LM4132 3.3V reference for about $3, and/or the 4.096V version, which should put me into the next range. These are rated at 0.05%.
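For what it's worth, here's my arithmetic on what 0.05% means on this meter, assuming a 3.75-digit display is 4000 counts (and noting that 4.096V overranges the 4V range, so it would be read on the 40V range at 10mV/count):

```python
# Back-of-envelope: how many display counts of uncertainty does a
# 0.05% reference contribute on a 4000-count (3.75-digit) meter?
# Assumptions: 4000 counts full scale; 3.3 V reads on the 4 V range
# (1 mV/count), 4.096 V reads on the 40 V range (10 mV/count).

COUNTS = 4000

def counts_of_error(vref, tol, meter_range):
    resolution = meter_range / COUNTS          # volts per count
    return vref * tol / resolution

print(counts_of_error(3.3,   0.0005, 4.0))    # ~1.65 counts
print(counts_of_error(4.096, 0.0005, 40.0))   # ~0.2 counts
```

So on the 4V range the reference tolerance is worth under 2 counts, and on the 40V range the 4.096V part is buried well below one count of resolution, which suggests either part is plenty good for calibrating a meter like this.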