I am putting a design together for a data logger that measures the current drawn from the grid by a domestic dwelling. Not being a rocket scientist, I am trying to keep the front end of this device as simple as possible. I am using a 0.1 ohm resistor in series with the load, with an isolating 24V transformer connected back to front to amplify and isolate the voltage across the series resistor. The output from the transformer feeds into a bridge rectifier, and the output from the bridge will be the voltage I measure with the ADC of a PIC.
I put a test rig together with two AVO meters (in AC mode): one to measure the current being drawn by the test load, the other to measure the voltage coming out of the transformer. My test load is a 500W lamp on a simple triac-based light dimmer circuit.
I did not want to get involved in doing a lot of sampling with the PIC and working out the RMS of the resulting AC voltage from the transformer, so I put a capacitor across the output of the bridge to give a rough approximation of the voltage (i.e. of the current being drawn by the load).
Trouble is, when the capacitor is connected the resultant voltage is always the same. If I disconnect the capacitor, the two AVOs read current and voltage in unison with the brightness of the lamp, which is what I am after.
Can somebody help me with any ideas on how I can smooth out the AC component of the voltage I am trying to read, without complicated software, which frankly would be beyond my programming skills? Thanks.