Two-Dimensional Frontier-Based Viewpoint Generation for Exploring and Mapping Underwater Environments

To autonomously explore complex underwater environments, it is desirable to use motion planning strategies that do not depend on prior knowledge of the environment. In this publication, we present a robotic exploration algorithm for autonomous underwater vehicles (AUVs) that guides the robot through an unknown two-dimensional (2D) environment. The algorithm builds upon view planning (VP) and frontier-based (FB) strategies. Traditional robotic exploration algorithms seek full coverage of the scene with data from only one sensor; when coverage is needed for multiple sensors, multiple exploration missions must be carried out. Our approach is designed to achieve full coverage with data from two sensors in a single exploration mission: occupancy data from a profiling sonar, from which the shape of the environment is perceived, and optical data from a camera, which captures the details of the environment. This saves both time and mission costs. The algorithm is designed to be computationally efficient, so that it can run online on the AUV's onboard computer. In our approach, the environment is represented using a labeled quadtree occupancy map which, at the same time, is used to generate the viewpoints that guide the exploration. We have tested the algorithm in different environments through numerous experiments, including sea operations using the Sparus II AUV and its sensor suite.
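To make the core idea concrete, the following is a minimal sketch, not the paper's implementation, of frontier labeling on a quadtree occupancy map: leaves carry an occupancy label, and free leaves that share an edge with unknown leaves are relabeled as frontiers, whose centers serve as candidate viewpoints. All names (`Quad`, `Label`, `label_frontiers`, `viewpoints`) are illustrative assumptions.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Label(Enum):
    UNKNOWN = 0
    FREE = 1
    OCCUPIED = 2
    FRONTIER = 3  # free space bordering unknown space

@dataclass
class Quad:
    # Axis-aligned square cell with lower-left corner (x, y)
    x: float
    y: float
    size: float
    label: Label = Label.UNKNOWN
    children: Optional[list] = None

    def subdivide(self):
        h = self.size / 2
        self.children = [Quad(self.x,     self.y,     h, self.label),
                         Quad(self.x + h, self.y,     h, self.label),
                         Quad(self.x,     self.y + h, h, self.label),
                         Quad(self.x + h, self.y + h, h, self.label)]

def leaves(node):
    if node.children is None:
        return [node]
    out = []
    for c in node.children:
        out.extend(leaves(c))
    return out

def adjacent(a, b, eps=1e-9):
    # Two axis-aligned cells are neighbors if they share an edge segment
    ax2, ay2 = a.x + a.size, a.y + a.size
    bx2, by2 = b.x + b.size, b.y + b.size
    touch_x = abs(ax2 - b.x) < eps or abs(bx2 - a.x) < eps
    touch_y = abs(ay2 - b.y) < eps or abs(by2 - a.y) < eps
    overlap_x = min(ax2, bx2) - max(a.x, b.x) > eps
    overlap_y = min(ay2, by2) - max(a.y, b.y) > eps
    return (touch_x and overlap_y) or (touch_y and overlap_x)

def label_frontiers(root):
    # Relabel free leaves adjacent to unknown leaves as frontiers
    ls = leaves(root)
    frontiers = []
    for a in ls:
        if a.label is not Label.FREE:
            continue
        if any(b.label is Label.UNKNOWN and adjacent(a, b) for b in ls):
            a.label = Label.FRONTIER
            frontiers.append(a)
    return frontiers

def viewpoints(frontiers):
    # Candidate viewpoints at the centers of frontier cells
    return [(f.x + f.size / 2, f.y + f.size / 2) for f in frontiers]
```

The pairwise adjacency test keeps the sketch short; an onboard implementation would instead use the quadtree's structure for neighbor queries, and would rank or filter the candidate viewpoints before sending one to the vehicle.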
This document is licensed under a Creative Commons Attribution 4.0 (CC BY 4.0) license.