I've decided to rustle up a simple electrical tester I've had in mind for some time. I'll say at the outset that I'm intending this to be simple and built with a bare minimum of design effort and circuitry.
The basic idea is a tester with no mode or range switching (from logic levels up to mains voltage) that delivers all its information audibly via different tones. The tests I want to run are polarity (subject to a minimum threshold of 1-1.5V to keep it logic-compatible), AC/DC (which in terms of detection is simply a fast polarity test), and continuity.
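To make the intent a bit more concrete, here's a rough sketch of the decision logic written out in Python. It's purely illustrative - the 1.2V threshold and the idea of classifying a burst of samples are just placeholders, and I'm not committing to doing this digitally rather than with a handful of comparators:

```python
POLARITY_THRESHOLD = 1.2   # volts; keeps the polarity test logic-compatible (my 1-1.5V figure)

def classify(samples):
    """Classify a short burst of probe-voltage samples into a tone category."""
    above = any(v > POLARITY_THRESHOLD for v in samples)
    below = any(v < -POLARITY_THRESHOLD for v in samples)
    if above and below:
        return "AC"            # polarity reverses within the burst -> the fast polarity test
    if above:
        return "DC, positive"
    if below:
        return "DC, negative"
    return "no voltage"        # below threshold both ways -> fall back to the continuity test

# e.g. classify([3.3] * 20) -> "DC, positive"
# e.g. a burst sampled across a 50Hz mains cycle swings both ways -> "AC"
```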
Detecting voltage between the probes is a no-brainer, but the continuity test is giving me pause for thought. Remember there's no mode switching, so anything we put on the probes mustn't materially affect the vast majority of circuits, even live ones - you'd never want to test voltage on a dead circuit, after all. The obvious way would be to place a signal on one of the probes at high impedance, so it can either be detected by the other probe or be easily overridden by the circuit under test when the probes aren't connected to each other.
However, a high-impedance output would seem to make it difficult to tell whether the "continuity" in question is a good one - I'd consider anything up to a few tens of ohms to be continuity, whereas hundreds of kilohms is intervening circuitry or even leakage. I don't see how you can discriminate between the two when the output resistance is necessarily 1M or more.
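Just to put some rough numbers on it (all the figures below are my own placeholders, nothing I've settled on): say the injected signal is 2V behind the 1M source resistance, and the detecting probe also looks into roughly 1M to ground so it doesn't load live circuits either. The resulting divider barely moves between a dead short and a few hundred K:

```python
V_DRIVE = 2.0     # volts, injected signal (placeholder figure)
R_SOURCE = 1e6    # ohms, output resistance of the injected signal (placeholder)
R_DETECT = 1e6    # ohms, detector input resistance to ground (placeholder)

def detected(r_probe_to_probe):
    """Voltage at the detecting probe for a given resistance between the probes."""
    return V_DRIVE * R_DETECT / (R_SOURCE + r_probe_to_probe + R_DETECT)

print(detected(30))     # ~1.000 V - "good" continuity, a few tens of ohms
print(detected(300e3))  # ~0.870 V - leakage or intervening circuitry
print(detected(10e6))   # ~0.167 V - probes effectively open
```

Barely 130mV separates a dead short from a 300K path on those figures, which is exactly the discrimination problem I'm struggling with.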
Therefore I'm thinking I must be heading down completely the wrong path. Does anyone know how commercial gear deals with this kind of issue?