Hello all,
As a newbie in the world of theoretical and practical electronics, I have come across quite a few areas where I have questions. One subject I am not clear on is the sensitivity a meter needs for accurate measurements with an analog voltmeter and/or a multimeter.
I understand that, as per the mathematics, the higher the meter's impedance, the less it loads the circuit and the more accurate the measurement. However, I was wondering what ohms/volt rating an analog voltmeter and/or multimeter would need in order for its readings to be considered accurate when testing computer/arcade PCBs, as well as for general troubleshooting of other common electronic devices (phone, PDA, etc.). Is 20,000 ohms/volt generally satisfactory, or does one really need a 50,000 ohms/volt meter? As a note, I have a digital multimeter with a 4 megohm input impedance; however, I am interested in an analog multimeter so that I can see any spikes that may be produced by the device under test.
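To check that I understand the loading math, here is my rough back-of-the-envelope calculation as a small Python sketch. The function name and all the values (10 V range, 10 kΩ source resistance) are just examples I made up to compare the two sensitivities; please correct me if I have the arithmetic wrong:

```python
# Sketch of meter-loading arithmetic (illustrative values only).
# An analog meter's input resistance is its sensitivity (ohms/volt)
# multiplied by the full-scale range in use.

def loading_error(sensitivity_ohms_per_volt: float,
                  range_volts: float,
                  source_resistance_ohms: float) -> float:
    """Percentage error from meter loading when measuring a node
    with the given Thevenin (source) resistance."""
    r_meter = sensitivity_ohms_per_volt * range_volts
    # The meter and the source resistance form a voltage divider:
    # the reading is V_true * R_meter / (R_meter + R_source).
    fraction_read = r_meter / (r_meter + source_resistance_ohms)
    return (1.0 - fraction_read) * 100.0

# Example: on the 10 V range across a node with 10 kOhm source resistance,
# a 20,000 ohms/V meter (200 kOhm input) reads about 4.8% low,
# while a 50,000 ohms/V meter (500 kOhm input) reads about 2.0% low.
for sens in (20_000, 50_000):
    err = loading_error(sens, 10.0, 10_000)
    print(f"{sens} ohms/V on 10 V range: ~{err:.1f}% low")
```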
Thank you all for your time and advice!
Respectfully,
Sam