Flandin02a
G. Flandin, F. Chaumette. Visual data fusion for objects localization by active vision. In Eur. Conf. on Computer Vision, ECCV'02, LNCS 2353, pp. 312-326, Copenhagen, Denmark, May 2002.
Abstract
Visual sensors provide exclusively uncertain and partial knowledge of a scene. In this article, we present a suitable scene knowledge representation that makes integration and fusion of new, uncertain and partial sensor measures possible. It is based on a mixture of stochastic and set membership models. We consider that, for a large class of applications, an approximated representation is sufficient to build a preliminary map of the scene. Our approximation mainly results in ellipsoidal calculus by means of a normal assumption for stochastic laws and ellipsoidal over or inner bounding for uniform laws. These approximations allow us to build an efficient estimation process integrating visual data on line. Based on this estimation scheme, optimal exploratory motions of the camera can be automatically determined. Real time experimental results validating our approach are finally given.
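The normal-assumption side of the representation described above can be illustrated with a minimal Gaussian fusion step. This is only a sketch of standard information-form fusion of two independent estimates, not the paper's full estimator (which also combines ellipsoidal set-membership bounds); the function name `fuse_gaussian` and the numerical values are illustrative assumptions.

```python
import numpy as np

def fuse_gaussian(mu1, P1, mu2, P2):
    # Information-form fusion of two independent Gaussian estimates:
    # information matrices (inverse covariances) add, and the fused
    # mean weights each estimate by its information.
    I1 = np.linalg.inv(P1)
    I2 = np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)
    mu = P @ (I1 @ mu1 + I2 @ mu2)
    return mu, P

# Example: fuse a prior estimate of a 2-D point with a new,
# more precise measurement (hypothetical values).
mu_prior = np.array([1.0, 2.0])
P_prior = np.diag([0.5, 0.5])
mu_meas = np.array([1.2, 1.8])
P_meas = np.diag([0.1, 0.1])

mu, P = fuse_gaussian(mu_prior, P_prior, mu_meas, P_meas)
# The fused estimate is pulled toward the more informative measurement,
# and its covariance is smaller than either input covariance.
```

Integrating each new visual measure this way keeps the estimation recursive, which is what makes on-line integration of visual data tractable.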
BibTex Reference
@InProceedings{Flandin02a,
Author = {Flandin, G. and Chaumette, F.},
Title = {Visual data fusion for objects localization by active vision},
BookTitle = {Eur. Conf. on Computer Vision, ECCV'02, LNCS 2353},
Pages = {312--326},
Address = {Copenhagen, Denmark},
Month = {May},
Year = {2002}
}