Underwater navigation and mapping with an omnidirectional optical sensor

Omnidirectional vision has received increasing interest from the computer vision community over the last decade. However, the use of omnidirectional cameras underwater remains very limited. In this thesis we propose several methods intended as a reference resource for designing, calibrating and using underwater omnidirectional multi-camera systems (OMS). The first problem we address is their design and calibration. Next, we study stitching strategies to generate omnidirectional panoramas from individual images. Finally, we focus on potential underwater applications. First, we explore the promising use of omnidirectional cameras to create immersive virtual experiences; second, we demonstrate the capabilities of omnidirectional cameras as complementary sensors for the navigation of underwater robots. To validate all presented algorithms, two custom omnidirectional cameras were built, and several experiments with divers and underwater robots were carried out to collect the necessary data.
Access to the contents of this thesis is subject to acceptance of the terms of use established by the following Creative Commons licence: http://creativecommons.org/licenses/by/4.0/