Using standard RMC NMEA sentences from a GPS unit, I'm trying to figure out the best way to calculate distance traveled over time. The two methods I have are:
- Use the lat/long coordinates each sample, calculate the distance from the previous fix using the Haversine formula (or similar), and keep a running total of distance traveled
- Use the speed-over-ground reading each sample, and multiply by the sample interval to get distance traveled during that interval
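For reference, here is a minimal Python sketch of how I'm computing each method (assuming one fix per fixed sample interval; the function names are just mine, and RMC reports speed over ground in knots, hence the conversion):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/long points (in degrees)."""
    R = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def distance_from_positions(fixes):
    """Method 1: sum Haversine distances between consecutive fixes.

    fixes: list of (lat_deg, lon_deg) tuples, one per RMC sentence.
    """
    return sum(haversine_m(*fixes[i - 1], *fixes[i])
               for i in range(1, len(fixes)))

def distance_from_speed(speeds_knots, interval_s):
    """Method 2: integrate reported speed over ground.

    speeds_knots: one SOG reading (knots) per sample interval.
    """
    return sum(v * 0.514444 * interval_s for v in speeds_knots)  # knots -> m/s
```

Both totals come out close on my test drives, but they drift apart slightly, which is what prompted the question.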
I have done some basic tests, and the two methods work out to be "almost" the same, but not quite. I don't know enough about the underlying GPS design to say how the lat/long fix and the speed reading are each derived. Any comments on which method is likely to be more accurate? (Or are they both equally inaccurate?)
What is interesting, however, is that when going under a bridge structure, the velocity readings become distorted (there is a jump and then a dip), while the lat/long readings don't seem to suffer from the same issue. Does anyone know why that is?