I've been working on a PCB that drops high voltages (±3.2 kV) down to ±2.4 V using a simple potential divider: a 10 GΩ (Mini-Mox) resistor (R1) in series with two 15 MΩ (MRS25) resistors (R2, R3). The signal is connected to a data logger with an input impedance of 10 MΩ. The high voltage is switched using high-voltage reed relays (Pickering). For testing purposes I'm exciting the relay coils with a bench power supply rather than the digital output pins of the logger. The PCB ground is connected to the high-voltage unit's ground.
With an input voltage of ~24 V from the high-voltage unit I'm recording an output of 0.4 V instead of the expected value of 0.020 V. This is observed in the data logger software.
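For reference, here's a quick sanity check of the divider arithmetic, under the assumption that the logger's 10 MΩ input impedance sits in parallel with the R2 + R3 lower leg (the resistor values and the ~0.020 V expectation are from my description above; the variable names are just for illustration):

```python
# Sanity check of the loaded potential divider described above.
# Assumption: the logger's 10 Mohm input appears in parallel with
# R2 + R3 (the 30 Mohm lower leg of the divider).

R1 = 10e9          # 10 Gohm series resistor (Mini-Mox)
R2_R3 = 2 * 15e6   # two 15 Mohm MRS25 resistors in series
R_LOGGER = 10e6    # data logger input impedance

# Effective lower leg: R2+R3 in parallel with the logger input (7.5 Mohm).
r_lower = (R2_R3 * R_LOGGER) / (R2_R3 + R_LOGGER)

def v_out(v_in: float) -> float:
    """Output of the loaded divider for a given input voltage."""
    return v_in * r_lower / (R1 + r_lower)

print(v_out(3200.0))  # ~2.40 V, the intended full-scale output
print(v_out(24.0))    # ~0.018 V, close to the expected ~0.020 V reading
```

So the expected reading at 24 V in is roughly 0.018–0.020 V, which makes the measured 0.4 V about 20x too high.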
I thought the discrepancy could be due to noise picked up via the 10 GΩ resistor, so I added a 100 nF ceramic capacitor from one end of the 10 GΩ resistor to ground. This results in an output of 4.0 V!
I then moved the cap to the terminals of the data logger (between analog in and common) and recorded a value of 1.0 V!
Strangely, the 4.0 V and 1.0 V readings don't appear to change much over time, which suggests a DC effect rather than AC noise. Does anyone have any pointers?