Vertical edge-based mapping using range-augmented omnidirectional vision sensor

Laser range finders and omnidirectional cameras are becoming a promising combination of sensors for extracting rich environmental information, including textured plane extraction, vanishing points, catadioptric projection of vertical and horizontal lines, and invariant image features. However, many indoor scenes lack sufficient texture to describe the environment; in these situations, vertical edges can be used instead. This study presents a sensor model that extracts the three-dimensional position of vertical edges from a range-augmented omnidirectional vision sensor. Using the unified spherical model for central catadioptric sensors together with the proposed sensor model, the vertical edges are locally projected, improving data association for mapping and localisation. The proposed sensor model was tested with the FastSLAM algorithm to solve the simultaneous localisation and mapping problem in indoor environments. Real-world qualitative and quantitative experiments validate the proposed approach using a Pioneer-3DX mobile robot equipped with a URG-04LX laser range finder and an omnidirectional camera with a parabolic mirror.
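As a rough illustration of the unified spherical model mentioned above, the sketch below projects a 3-D point (and sample points along a vertical edge) onto a central catadioptric image: the point is first mapped to the unit sphere, the sphere centre is shifted by the mirror parameter ξ along the optical axis, and a perspective projection with pinhole intrinsics follows. This is a generic sketch of the standard model, not the paper's exact sensor model; ξ = 1 corresponds to a parabolic mirror, and the intrinsics `gamma`, `u0`, `v0` are hypothetical values chosen for illustration.

```python
import math

def project_point(X, xi=1.0, gamma=200.0, u0=320.0, v0=240.0):
    """Project a 3-D point (camera frame) to catadioptric image coordinates
    using the unified spherical model. Intrinsics here are illustrative only."""
    x, y, z = X
    r = math.sqrt(x * x + y * y + z * z)
    xs, ys, zs = x / r, y / r, z / r       # step 1: project onto the unit sphere
    denom = zs + xi                         # step 2: shift viewpoint by xi along z
    u = gamma * xs / denom + u0             # step 3: perspective projection with
    v = gamma * ys / denom + v0             #         pinhole intrinsics
    return u, v

def project_vertical_edge(x, y, z_min, z_max, n=5, **kw):
    """Sample a vertical edge at horizontal position (x, y) between heights
    z_min and z_max, projecting each sample point into the image."""
    return [project_point((x, y, z_min + i * (z_max - z_min) / (n - 1)), **kw)
            for i in range(n)]
```

A useful property of the parabolic case (ξ = 1) is visible in this sketch: the image coordinates satisfy (u − u0)/(v − v0) = x/y independently of height, so a vertical edge projects to a radial line through the principal point, which is what makes vertical edges convenient landmarks for data association in omnidirectional images.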
All rights reserved