I am preparing a hiking database, showing distance, start/end elevations, elevation gain, anticipated time, etc. for various hiking legs. I have defined 86 waypoints, and some 200 possible sequential connections among them. I have GPS coordinates for each waypoint, but elevations are problematic because I cannot trust the readings from my GPS, a Garmin 64ts.

How can I determine elevations from GPS coordinates? For example, the door to my home is at 20°18.237'N 103°15.804'W. My GPS gives varying altitudes for the same location on different days, sometimes differing by as much as 100–200 feet.
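(For reference, most elevation lookups against a DEM or web service want decimal degrees rather than the degree/decimal-minute format my Garmin displays. A minimal conversion sketch in Python, just for illustration, using the coordinates above:)

```python
# Convert degree / decimal-minute coordinates (the format shown on the
# Garmin) to the decimal degrees that most DEM and elevation-API lookups
# expect. South and West hemispheres become negative values.
def dmm_to_decimal(degrees, minutes, hemisphere):
    """hemisphere is one of 'N', 'S', 'E', 'W'."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0)

lat = dmm_to_decimal(20, 18.237, "N")   # 20.30395
lon = dmm_to_decimal(103, 15.804, "W")  # -103.2634
print(lat, lon)
```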

The Garmin can display elevation from both barometric and GPS sources. The barometric readings are calibrated to a "known" altitude before every hike: according to Google Earth, the altitude at street level next to my door is 5,204 feet. I realize that Google Earth's elevations are just interpolated between topographic contour lines, and in the hilly terrain where I live this is likely to err by hundreds of feet. Presently I am using elevations that are the average of as many as 10 different samplings from the GPS over a period of months. Just sitting at my desk as I type this, the elevations shown have varied by 26 feet (barometric) and 37 feet (GPS).
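(The averaging I describe can be summarized per waypoint along these lines; the mean is the working value and the spread shows how much the instrument wanders. The sample values below are hypothetical, not my actual readings:)

```python
# Summarize repeated elevation samples for one waypoint.
from statistics import mean, stdev

samples_ft = [5204, 5191, 5217, 5199, 5210, 5186, 5222, 5203, 5195, 5208]

avg = mean(samples_ft)                      # working elevation value
spread = max(samples_ft) - min(samples_ft)  # worst-case scatter
sigma = stdev(samples_ft)
print(f"mean {avg:.0f} ft, range {spread} ft, stdev {sigma:.1f} ft")
```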

One large contributor to the inconsistencies I get is that there is a considerable lag between arriving at a waypoint and the GPS settling down to a reliable (?) elevation reading, particularly if there has been a large change in elevation in the previous few minutes. It can take five minutes or more for the readings (both barometric and GPS) to stabilize. If I have just arrived at a waypoint that is, say, 250 feet higher than the previous waypoint, the GPS will generally show an elevation that is about 50–100 feet lower than the true elevation. Over the next 5–10 minutes, I can watch the elevation numbers advance, climbing one foot at a time until they finally reach the correct elevation. Except that if I wait another 5–10 minutes, they might (or might not) continue climbing past that elevation while I stand perfectly still, watching the GPS screen.

The GPS distances seem to record pretty accurately, and the GPS coordinates seem to be spot-on, at least when compared to Google Earth’s coordinates for known locations. The elevations… not so much.

So, I’ve got a set of believable GPS coordinates. How can I get reasonable elevations for those points? Absolute accuracy is not what I’m looking for. Every one of them could be 400 feet off, as long as it was the same 400 feet in every case. But I want the elevation difference between point A and point B to be the same every time I measure it, plus or minus 10 feet if possible, 25 feet at most.
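(To put that requirement another way: if every elevation comes from one consistent source, a constant bias cancels when I take differences between waypoints, which is all I need. A tiny sketch of that arithmetic; the waypoint names and elevations are hypothetical:)

```python
# Elevation differences from a single consistent source are unaffected
# by a constant bias in that source: the offset cancels on subtraction.
elev_ft = {"A": 5204.0, "B": 5456.0, "C": 5390.0}  # hypothetical waypoints

def diff(p, q, offset=0.0):
    """Elevation of p minus elevation of q, with an optional constant bias."""
    return (elev_ft[p] + offset) - (elev_ft[q] + offset)

print(diff("B", "A"))               # 252.0
print(diff("B", "A", offset=400.0)) # still 252.0 -- the bias cancels
```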

How can I do this?

tanstaafl.
_________________________
"There Ain't No Such Thing As A Free Lunch"