

A.I. Comport, D. Kragic, E. Marchand, F. Chaumette. Robust real-time visual tracking: Comparison, theoretical analysis and performance evaluation. In IEEE Int. Conf. on Robotics and Automation, ICRA'05, Pages 2852-2857, Barcelona, Spain, April 2005.

Download

Download paper: DOI page

Download paper: HAL (Hyper Archive en Ligne)

Download paper: Adobe portable document format (pdf)

Copyright notice:

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder. This page is automatically generated by bib2html v217, © Inria 2002-2024, Projet Lagadic/Rainbow


Abstract

In this paper, two real-time pose tracking algorithms for rigid objects are compared. Both methods are 3D model-based and are capable of computing the pose between the camera and an object with a monocular vision system. Special consideration has been given to defining and evaluating different performance criteria such as computational efficiency, accuracy and robustness. Both methods are described and a unifying framework is derived. The main advantage of both algorithms lies in their real-time capability (on standard hardware) combined with robustness to mis-tracking, occlusion and changes in illumination.
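Neither of the paper's two algorithms is reproduced here, but the core idea behind 3D model-based monocular pose tracking — iteratively minimizing the reprojection error between a known 3D model and its observed image projection — can be sketched as follows. This is a minimal illustration assuming NumPy; the point values, the numeric Jacobian, and the plain Gauss-Newton refinement are illustrative choices of this sketch, not the paper's formulation (which uses an analytic interaction matrix and robust M-estimation).

```python
import numpy as np

def rodrigues(w):
    """Rotation matrix from an axis-angle vector (Rodrigues' formula)."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def project(points, pose):
    """Pinhole projection of 3D model points; pose = (rotation vector, translation)."""
    R = rodrigues(pose[:3])
    P = points @ R.T + pose[3:]
    return P[:, :2] / P[:, 2:3]

def gauss_newton_pose(points, observed, pose0, iters=20):
    """Refine the 6-DOF pose by minimizing reprojection error (numeric Jacobian)."""
    pose = pose0.copy()
    eps = 1e-6
    for _ in range(iters):
        r = (project(points, pose) - observed).ravel()
        J = np.zeros((r.size, 6))
        for i in range(6):
            d = np.zeros(6)
            d[i] = eps
            J[:, i] = ((project(points, pose + d) - observed).ravel() - r) / eps
        pose = pose - np.linalg.lstsq(J, r, rcond=None)[0]
    return pose

# Hypothetical setup: corners of a unit cube observed under a known pose.
model = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)
true_pose = np.array([0.1, -0.05, 0.2, 0.3, -0.1, 4.0])  # (rot vec, translation)
image_pts = project(model, true_pose)

# Start from a rough initial pose and refine until the projections align.
est = gauss_newton_pose(model, image_pts, np.array([0, 0, 0, 0, 0, 3.0]))
err = np.abs(project(model, est) - image_pts).max()
```

In a tracking loop, the pose estimated for one frame serves as the initial guess for the next, which is what makes real-time operation feasible.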


Contact

Eric Marchand
François Chaumette

BibTex Reference

   Author = {Comport, A.I. and Kragic, D. and Marchand, E. and Chaumette, F.},
   Title = {Robust real-time visual tracking: Comparison, theoretical analysis and performance evaluation},
   BookTitle = {IEEE Int. Conf. on Robotics and Automation, ICRA'05},
   Pages = {2852--2857},
   Address = {Barcelona, Spain},
   Month = {April},
   Year = {2005}

EndNote Reference

Get EndNote Reference (.ref)