Servant08a
F. Servant, E. Marchand, P. Houlier, I. Marchal. Visual planes-based simultaneous localization and model refinement for augmented reality. In IAPR, Int. Conf. on Pattern Recognition, ICPR'08, Tampa, Florida, December 2008.
Abstract
This paper presents a method for camera pose tracking that uses partial knowledge about the scene. The method is based on monocular vision Simultaneous Localization And Mapping (SLAM). With respect to classical SLAM implementations, this approach uses previously known information about the environment (a rough map of the walls) and profits from the various available databases and blueprints to constrain the problem. The method assumes that the tracked image patches belong to known planes (with some uncertainty in their localization) and that the SLAM map can be represented by associations of cameras and planes. In this paper, we propose an adapted SLAM implementation and detail the considered models. We show that this method gives good results on a real sequence with complex motion for an augmented reality (AR) application.
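The core geometric constraint described in the abstract is that a tracked image patch is assumed to lie on an (approximately) known plane, so its 3D position can be recovered by intersecting its viewing ray with that plane rather than estimated as a free point. A minimal sketch of that ray-plane intersection (an illustration only, not the authors' implementation; the plane parameterization and frame conventions here are assumptions):

```python
import numpy as np

def ray_plane_intersection(origin, direction, n, d):
    """Intersect a viewing ray with the plane {x : n.x + d = 0}.

    origin    -- camera center in the world frame
    direction -- ray direction through the tracked patch (world frame)
    n, d      -- unit plane normal and signed offset (assumed known,
                 e.g. from a building blueprint)
    Returns the 3D point on the plane, or None if the ray is parallel.
    """
    denom = n @ direction
    if abs(denom) < 1e-9:          # ray parallel to the plane
        return None
    t = -(d + n @ origin) / denom  # solve n.(origin + t*direction) + d = 0
    if t < 0:                      # plane is behind the camera
        return None
    return origin + t * direction

# Example: ground plane z = 0 (n = [0, 0, 1], d = 0), camera at height 2,
# looking forward and down.
p = ray_plane_intersection(np.array([0.0, 0.0, 2.0]),
                           np.array([0.0, 0.5, -1.0]),
                           np.array([0.0, 0.0, 1.0]), 0.0)
# p is the patch's world position implied by the plane constraint
```

In the paper's setting the plane parameters carry uncertainty and enter the SLAM state, so this intersection is used inside the filter's measurement model rather than as a one-shot triangulation.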
BibTeX Reference
@InProceedings{Servant08a,
Author = {Servant, F. and Marchand, E. and Houlier, P. and Marchal, I.},
Title = {Visual planes-based simultaneous localization and model refinement for augmented reality},
BookTitle = {IAPR, Int. Conf. on Pattern Recognition, ICPR'08},
Address = {Tampa, Florida},
Month = {December},
Year = {2008}
}