Multimedia

Volume 20 Issue 07 - Publication Date: 1 July 2001

Edge-Projected Integration of Image and Model Cues for Robust Model-Based Object Tracking
 
M. Vincze, M. Ayromlou, W. Ponweiser and M. Zillich, Vienna University of Technology
 
A real-world limitation of visual servoing approaches is the sensitivity of visual tracking to varying ambient conditions and background clutter. The authors present a model-based vision framework to improve the robustness of edge-based feature tracking. Lines and ellipses are tracked using edge-projected integration of cues (EPIC). EPIC uses cues in regions delineated by edges that are defined by observed edgels and a priori knowledge from a wire-frame model of the object. The edgels are then used for a robust fit of the feature geometry, but at times this results in multiple feature candidates. A final validation step uses the model topology to select the most likely feature candidates. EPIC is suited for real-time operation. Experiments demonstrate operation at frame rate. Navigating a walking robot through an industrial environment shows the robustness to varying lighting conditions. Tracking objects over varying backgrounds indicates robustness to clutter.
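The robust fit of the feature geometry to observed edgels, as described above, must tolerate spurious edgels from background clutter. A generic way to achieve this is a RANSAC-style consensus fit; the sketch below is an illustration of that idea, not the authors' EPIC implementation, and the names `fit_line`, `ransac_line` and the parameters `iters` and `tol` are assumptions made for the example.

```python
import random

def fit_line(p1, p2):
    # Line through two points as (a, b, c) with a*x + b*y + c = 0,
    # normalized so that |a*x + b*y + c| is the point-to-line distance.
    (x1, y1), (x2, y2) = p1, p2
    a, b = y2 - y1, x1 - x2
    norm = (a * a + b * b) ** 0.5
    if norm == 0:
        return None  # degenerate sample: both points coincide
    c = -(a * x1 + b * y1)
    return a / norm, b / norm, c / norm

def ransac_line(edgels, iters=200, tol=1.5, seed=0):
    """Fit a line to noisy edgels: return (line, inliers) for the
    candidate line supported by the largest set of edgels."""
    rng = random.Random(seed)
    best_line, best_inliers = None, []
    for _ in range(iters):
        p1, p2 = rng.sample(edgels, 2)
        line = fit_line(p1, p2)
        if line is None:
            continue
        a, b, c = line
        inliers = [(x, y) for (x, y) in edgels
                   if abs(a * x + b * y + c) <= tol]
        if len(inliers) > len(best_inliers):
            best_line, best_inliers = line, inliers
    return best_line, best_inliers

# Example: 20 edgels on the line y = 2x + 1 plus three clutter points.
edgels = [(x, 2 * x + 1) for x in range(20)] + [(3, 40), (7, -5), (15, 100)]
line, inliers = ransac_line(edgels, seed=1)
```

When a clutter edge attracts part of the consensus set, such a fit can yield multiple plausible candidates, which is where the validation against the model topology described above selects the most likely one.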
 
 
Extension 1: Tracking a lamp shade using just gradient information (2.6MB)
Extension 2: Tracking of the lamp shade using EPIC. The tracker lines are placed along the visible contour detected in the previous tracking cycle. At both ends, one tracker line is placed beyond the end to enable adaptation to changes in occlusion or orientation of the ellipse arc (2.1MB)
Extension 3: Samples of tracking an ellipse arc (2.5MB)
Extension 4: Tracking the base of a monitor. The pose estimated from the lines and the ellipse is re-projected into the image. Tracking results are displayed in yellow (3.1MB)
Extension 5: Tracking the mockup from a walking robot (8.5MB)
Extension 6: Examples of tracking from a walking robot in a part of the mockup with only five visible edges (4.4MB)
Extension 7: Tracking in an office environment from a walking robot. The pose estimated from the lines is re-projected into the image. Tracking results are displayed in yellow (1.1MB)
 