Vertical edge-based mapping using range-augmented omnidirectional vision sensor

Full text: Vertical-edge-based-mapping.pdf (closed access)
Laser range finders and omnidirectional cameras are a promising combination of sensors for extracting rich environmental information, including textured plane extraction, vanishing points, catadioptric projection of vertical and horizontal lines, and invariant image features. However, many indoor scenes lack sufficient texture to describe the environment; in these situations, vertical edges can be used instead. This study presents a sensor model that extracts the three-dimensional position of vertical edges from a range-augmented omnidirectional vision sensor. Using the unified spherical model for central catadioptric sensors together with the proposed sensor model, the vertical edges are locally projected, improving data association for mapping and localisation. The proposed sensor model was tested with the FastSLAM algorithm to solve the simultaneous localisation and mapping problem in indoor environments. Real-world qualitative and quantitative experiments validate the approach using a Pioneer-3DX mobile robot equipped with a URG-04LX laser range finder and an omnidirectional camera with a parabolic mirror.
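As a rough illustration of the unified spherical model mentioned in the abstract, the sketch below projects 3-D points in two steps: onto the unit sphere, then from a virtual centre at distance ξ onto the image plane (ξ = 1 for a parabolic mirror). The focal length and principal point are arbitrary illustrative values, not taken from the paper. A useful property for data association of vertical edges follows directly: points sharing the same (x, y) but different heights all land on one radial line through the principal point.

```python
import numpy as np

XI = 1.0  # mirror parameter; xi = 1 corresponds to a parabolic mirror

def project_unified(point, xi=XI, focal=300.0, center=(320.0, 240.0)):
    """Unified spherical model (illustrative intrinsics):
    1) project the 3-D point onto the unit sphere,
    2) reproject from the virtual centre at distance xi onto the
       normalised plane,
    3) apply pinhole intrinsics."""
    p = np.asarray(point, dtype=float)
    s = p / np.linalg.norm(p)      # point on the unit sphere
    m = s[:2] / (s[2] + xi)        # reprojection onto the normalised plane
    return np.array([focal * m[0] + center[0],
                     focal * m[1] + center[1]])

# A vertical edge: same (x, y), varying height z. Its samples project
# onto a single radial line through the principal point.
pts = [project_unified((1.0, 0.5, z)) for z in (-0.5, 0.0, 0.5, 1.0)]
```

Under this model the image direction of each projected point is (x, y)/(z + ξ‖P‖), so the ratio of image coordinates (relative to the principal point) is y/x regardless of z, which is why a vertical edge can be matched as one radial feature rather than point by point.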
All rights reserved