Extrinsic visual–inertial calibration for motion distortion correction of underwater 3D scans

Underwater 3D laser scanners are an essential type of sensor used by unmanned underwater vehicles (UUVs) for operations such as navigation, inspection, and object recognition and manipulation. Scanners that acquire 3D data by sweeping a laser plane across the scene can provide very high lateral resolution. However, their data may suffer from a rolling-shutter effect if the change of pose of the robot with respect to the scanned target during the sweep is not negligible. To compensate for motion-related distortion without resorting to point cloud post-processing, the 6-DoF pose at which the scanner acquires each line must be accurately known. In the underwater domain, autonomous vehicles are often equipped with a high-end inertial navigation system (INS) that provides reliable navigation data. Nonetheless, the relative pose of the 3D scanner with respect to the inertial reference frame of the robot is not usually known a priori. Therefore, this paper uses an ego-motion-based calibration algorithm to estimate the extrinsic parameters of the visual–inertial sensor pair. Simulations are performed to quantify how miscalibration affects motion-related distortion, and the method is also evaluated experimentally under laboratory conditions.
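To make the per-line compensation idea concrete, the sketch below shows one way a laser-plane sweep could be undistorted once the scanner-to-INS extrinsic transform is known: each scan line is re-expressed in a common world frame using the INS pose interpolated to that line's timestamp. This is not the paper's implementation; all function names, the toy data, and the variable T_ins_scanner are hypothetical, and the interpolation scheme (linear translation, slerp orientation) is an assumption chosen for brevity.

import numpy as np
from scipy.spatial.transform import Rotation, Slerp


def undistort_sweep(line_times, line_points, ins_times, ins_rots, ins_trans,
                    T_ins_scanner):
    """Express every scan line of a sweep in the world frame.

    line_times    : (L,)     timestamp of each scan line
    line_points   : list of (N_i, 3) arrays, line points in the scanner frame
    ins_times     : (M,)     INS timestamps (must bracket line_times)
    ins_rots      : Rotation world <- INS orientations at ins_times
    ins_trans     : (M, 3)   world <- INS translations at ins_times
    T_ins_scanner : (4, 4)   homogeneous extrinsic, INS <- scanner (the calibrated quantity)
    """
    slerp = Slerp(ins_times, ins_rots)                # orientation interpolation
    world_points = []
    for t, pts in zip(line_times, line_points):
        R_wi = slerp([t]).as_matrix()[0]              # world <- INS rotation at time t
        t_wi = np.array([np.interp(t, ins_times, ins_trans[:, k]) for k in range(3)])
        T_wi = np.eye(4)
        T_wi[:3, :3], T_wi[:3, 3] = R_wi, t_wi        # world <- INS pose at time t
        T_ws = T_wi @ T_ins_scanner                   # world <- scanner at time t
        pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])
        world_points.append((T_ws @ pts_h.T).T[:, :3])  # motion-compensated line
    return np.vstack(world_points)


if __name__ == "__main__":
    # Toy example: the vehicle translates 0.5 m along x while three lines are swept.
    ins_times = np.array([0.0, 1.0])
    ins_rots = Rotation.from_quat([[0, 0, 0, 1], [0, 0, 0, 1]])
    ins_trans = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0]])
    T_ins_scanner = np.eye(4)                         # identity extrinsic for the toy case
    line_times = np.array([0.0, 0.5, 1.0])
    line_points = [np.array([[0.0, y, 2.0] for y in (-0.1, 0.0, 0.1)])] * 3
    print(undistort_sweep(line_times, line_points, ins_times, ins_rots,
                          ins_trans, T_ins_scanner))

The same structure makes the role of the extrinsic calibration visible: an error in T_ins_scanner biases T_ws for every line and therefore reintroduces distortion into the assembled point cloud, which is exactly the effect the paper's simulations quantify.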
This document is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 (CC BY-NC-ND 4.0) license.