Volume 26 Issue 6 - Publication Date: 1 June 2007
Simultaneous Motion and Structure Estimation by Fusion of Inertial and Vision Data
P. Gemeiner, P. Einramhof, and M. Vincze
Automation and Control Institute, Vienna University of Technology, Gusshausstrasse 27-29/376, 1040 Vienna, Austria
For mobile robotics, head-mounted gear in augmented reality (AR) applications, or computer vision, it is essential to continuously estimate the egomotion and the structure of the environment. This paper presents the system developed in the SmartTracking project, which integrates visual and inertial sensors in a combined estimation scheme. The sparse structure estimation is based on the detection of corner features in the environment. From a single known starting position, the system can move into an unknown environment. The vision and inertial data are fused, and the performance of the Unscented Kalman filter and the Extended Kalman filter is compared for this task. The filters are designed to handle asynchronous input from the visual and inertial sensors, which typically operate at different and possibly varying rates. Additionally, a bank of Extended Kalman filters, one per corner feature, estimates the position and quality of structure points and includes them in the structure estimation process. The system is demonstrated on a mobile robot executing known motions, so that the egomotion estimates in an unknown environment can be compared to ground truth.
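The asynchronous-input idea from the abstract, predicting the filter state forward to each measurement's timestamp so that sensors with different and varying rates can be fused in arrival order, can be sketched with a minimal linear Kalman filter. This is an illustrative assumption-laden toy (1-D constant-velocity state, made-up rates and noise values), not the paper's 6-DoF UKF/EKF:

```python
import numpy as np

class AsyncKalmanFilter:
    """Toy 1-D Kalman filter that predicts to each measurement's
    timestamp before updating, so asynchronous sensors can be fused
    in arrival order. State: [position, velocity]."""

    def __init__(self, q=0.1):
        self.x = np.zeros(2)   # start at rest at the known origin
        self.P = np.eye(2)
        self.q = q             # process-noise intensity (assumed)
        self.t = 0.0           # time of the last processed measurement

    def _predict(self, t):
        # Propagate the state over the (possibly irregular) gap dt.
        dt = t - self.t
        F = np.array([[1.0, dt], [0.0, 1.0]])
        Q = self.q * np.array([[dt**3 / 3, dt**2 / 2],
                               [dt**2 / 2, dt]])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q
        self.t = t

    def update(self, t, z, H, r):
        """Fuse one scalar measurement z at time t with observation
        row H and noise variance r."""
        self._predict(t)
        H = np.atleast_2d(H)
        S = H @ self.P @ H.T + r           # innovation variance
        K = self.P @ H.T / S               # Kalman gain
        self.x = self.x + (K * (z - H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P

# Asynchronous stream: "inertial" velocity readings at 100 Hz and a
# "vision" position fix at 10 Hz (both rates and noises invented here).
kf = AsyncKalmanFilter()
H_pos = np.array([1.0, 0.0])   # camera observes position
H_vel = np.array([0.0, 1.0])   # IMU-derived velocity observation
for k in range(1, 100):
    t = 0.01 * k
    kf.update(t, 1.0, H_vel, 0.05)    # inertial: constant 1 m/s
    if k % 10 == 0:
        kf.update(t, t, H_pos, 0.01)  # vision: true position is t

print(kf.x)   # estimate near [0.99, 1.0] for this noise-free input
```

The key design point is that `_predict` is driven by measurement timestamps rather than a fixed cycle, so neither sensor has to be resampled onto the other's clock.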
Multimedia Key: Video | Data | Code | Image
Example 1: Video of pre-defined motion (1st row in Tab. 1).
Example 2: Video of pre-defined motion (3rd row in Tab. 1).