I'm trying to set up a simple testbed program to read three analog voltages on AN0, AN1, and AN2 (pins 2, 3, and 4, respectively) of a PIC 18LF2525. Vdd is +5.0 VDC. The ADC result is read out of the 16-bit result register pair (the converter itself is 10-bit), and the ADC clock is set for internal (RC) operation. The program selects a channel, waits 20 us for the conversion to complete, reads the value, and scales it based on a 5 V full scale. When all three channels have been read and scaled, the program sends a message out the RS232 port to a HyperTerminal session. The message is sent every 10 seconds.
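For reference, the per-channel read is roughly along the lines of the sketch below (XC8-style C). The register and bit positions come from the 18F2525 data sheet, but the function name, clock value, and exact bit masks are illustrative assumptions rather than my literal code:

#include <xc.h>

#define _XTAL_FREQ 8000000UL        /* assumed Fosc, needed by __delay_us() */

/* Read one ADC channel; assumes ADCON2 has ADFM = 1 (right-justified result)
   and the ADC clock set to the internal FRC oscillator. */
unsigned int read_channel(unsigned char ch)
{
    /* Select the channel (CHS bits sit in ADCON0<5:2>) and keep ADON set. */
    ADCON0 = (unsigned char)((ch & 0x0F) << 2) | 0x01;

    __delay_us(20);                 /* settling/acquisition delay */

    ADCON0 |= 0x02;                 /* set GO/DONE to start the conversion */
    while (ADCON0 & 0x02)           /* wait for GO/DONE to clear */
        ;

    return ((unsigned int)ADRESH << 8) | (unsigned int)ADRESL;
}

The scaled reading is then just raw * 5.0 / 1023.0 (10-bit converter, 5 V full scale), or raw * Vref / 1023.0 once the external reference is in place, and the three results go out the serial port every 10 seconds.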
I tried pulling each input to ground through a 100K resistor, expecting a 0.000 V reading, but I saw significant voltage fluctuations (over 0.1 V) on each channel. Tying a 0.1 uF capacitor across each resistor improved the readings on channels 0 and 1 (they stayed pretty consistently at 0.000 V) but did not eliminate the problem on channel 2; it did reduce it to about 0.014 V to 0.019 V.

Looking at the Vdd line with a scope, I saw a significant dip (~100 mV) in Vdd during the message output every 10 seconds, so I tried setting up an external Vref+ on pin 5 with a dedicated +5.0 VDC regulator and 10 uF and 0.1 uF caps to ground. The Vref+ line looked very stable on the scope, and channels 0 and 1 stayed consistently at 0.000 V, but channel 2 is still varying (0.014 V to 0.019 V).
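In case it matters, the ADC setup for the external Vref+ attempt is essentially as follows; again, the PCFG/VCFG/ADCS values are my reading of the data sheet and should be treated as assumptions rather than my exact code:

void adc_init(void)
{
    TRISA |= 0x0F;      /* RA0-RA3 as inputs: AN0-AN2 plus Vref+ on AN3 (pin 5) */

    /* ADCON1: VCFG1 = 0 (Vref- = Vss), VCFG0 = 1 (Vref+ = AN3 pin),
       PCFG = 1011 -> AN0-AN3 analog, the rest digital. */
    ADCON1 = 0x1B;

    /* ADCON2: ADFM = 1 (right-justified), ACQT = 000 (manual acquisition),
       ADCS = 111 (internal FRC clock). */
    ADCON2 = 0x87;

    ADCON0 = 0x01;      /* ADC module on, channel 0 selected */
}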
By the way, I've tried increasing the delay time (after setting the channel, before reading the value) to as much as 1 ms, with the same results.
Has anyone else experienced this sort of behaviour? Is there a solution for it?
I'd really appreciate any thoughts, similar experiences, or (better yet) examples of how this problem can be solved.