Visual Odometry

A small discrepancy can lead to a great error. — Chinese proverb

We present an approach to computing visual odometry for outdoor robots equipped with a stereo rig. Instead of the typical feature matching or tracking, we use an improved stereo-tracking method that simultaneously determines the feature displacement in both cameras. Based on the matched features, a two-point algorithm for the resulting quadrifocal setting is carried out in a RANSAC framework to recover the unknown odometry. In addition, if the change in rotation can be accurately derived from other sensors, e.g., inertial sensors, a one-point algorithm can be used to obtain the remaining translational unknowns. We have implemented both algorithms on an outdoor robot operating in challenging terrain and present extensive outdoor experiments. The approach is robust and deals well with challenging conditions such as wheel slippage.
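To illustrate the idea behind the one-point case, the following is a minimal, hypothetical sketch (not the paper's implementation): given the rotation R from an inertial sensor, each 3D point correspondence (p, q) between two frames yields a full translation hypothesis t = q − R p, so RANSAC can sample a single correspondence per iteration and score it by counting inliers. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def ransac_translation(points_prev, points_curr, R, iters=100, thresh=0.05, seed=0):
    """One-point RANSAC sketch: with rotation R known (e.g. from an IMU),
    a single 3D correspondence gives the translation hypothesis t = q - R p.
    Returns the hypothesis with the largest inlier count."""
    rng = np.random.default_rng(seed)
    n = len(points_prev)
    best_t, best_inliers = None, -1
    for _ in range(iters):
        i = rng.integers(n)                       # sample one correspondence
        t = points_curr[i] - R @ points_prev[i]   # translation hypothesis
        # residual of every correspondence under the hypothesized motion
        residuals = np.linalg.norm(points_curr - (points_prev @ R.T + t), axis=1)
        inliers = int((residuals < thresh).sum())
        if inliers > best_inliers:
            best_t, best_inliers = t, inliers
    return best_t, best_inliers
```

The two-point variant in the text follows the same pattern but must sample two correspondences per iteration, since rotation and translation are then estimated jointly; the sampling size always matches the number of points needed to instantiate a full motion hypothesis.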