I would think that the wider the baseline between the cameras, the better: a longer baseline produces larger disparities for distant objects, and therefore better depth resolution (though if it's too wide, the two views start to look different enough that matching gets harder).
The cameras would need a known configuration (baseline and relative orientation), and some calibration/characterization of the resulting images would be needed so the pair can be rectified into alignment.
After that, pick a pixel in the first picture along with a small window of pixels around it, then search along the corresponding row of the other pic for the window that matches it best; the horizontal shift of the best match is that pixel's disparity. Scan the image in this manner and you get an array of disparity values.
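A rough sketch of that window-matching idea in Python (all names and parameters here are my own illustration, not anything standard): brute-force sum-of-absolute-differences block matching on a rectified grayscale pair.

```python
import numpy as np

def disparity_map(left, right, window=5, max_disp=16):
    """Brute-force SAD block matching on rectified grayscale images.

    For each pixel in the left image, compare a window around it
    against horizontally shifted windows in the right image and keep
    the shift (disparity) with the smallest sum of absolute
    differences.
    """
    h, w = left.shape
    half = window // 2
    disp = np.zeros((h, w), dtype=np.int32)
    # Skip the borders where a full window or full search range won't fit.
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1]
            best_d, best_cost = 0, np.inf
            for d in range(max_disp):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(ref.astype(int) - cand.astype(int)).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Real implementations add sub-pixel refinement and smarter search, but this is the core comparison loop.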
Those disparity values are inversely related to distance from the camera: nearer objects shift more between the two views. edit: Without calibration, though, those values would only be relative, not absolute.
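The standard pinhole-stereo relation makes that concrete: depth Z = f * B / d, where f is the focal length in pixels, B the baseline, and d the disparity. A minimal sketch (function name and numbers are mine for illustration):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Convert a disparity (in pixels) to absolute depth (in metres).

    This is where calibration comes in: without a known focal length
    and baseline, disparities only rank points by relative distance.
    """
    return focal_px * baseline_m / disparity_px

# e.g. a 1000 px focal length, 0.5 m baseline, 10 px disparity
# puts the point 50 m away.
```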
How to match up the info across many stereo pairs... maybe tagging each image with GPS coordinates and a measured altitude above ground could help register them against each other.
Edited by gbeer (27/02/2009 21:28)
_________________________
Glenn