Greetings,
I'm trying to write a routine to calculate milliampere-hours (mAh) for a battery charger I'm designing. The approach I intend to use is to sample the charging current once per second, store the first sample in a 32-bit accumulator, and add each subsequent sample to it while keeping a count of the number of samples. Every ten seconds or so, I'll divide the sum by the sample count to get the average current. If I then multiply that average by the fraction of an hour that has elapsed, I should think that would give the mAh that have been put into the battery so far. Total charge time will only be a couple of hours.
Does anyone see anything wrong with this approach?
Thanks for your help.