Underwater Visual Acoustic SLAM with Extrinsic Calibration

Shida Xu, Tomasz Luczynski, Jonatan Scharff Willners, Ziyang Hong, Kaicheng Zhang, Yvan R. Petillot, Sen Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

17 Citations (Scopus)

Abstract

Underwater scenarios are challenging for visual Simultaneous Localization and Mapping (SLAM) due to limited visibility and the intermittent loss of structure in image views. In this paper, we propose a visual acoustic bundle adjustment system which fuses a camera and a Doppler Velocity Log (DVL) in a graph SLAM framework for reliable underwater localization and mapping. To fuse the visual and acoustic measurements, a calibration algorithm is also designed to estimate the extrinsic parameters between a camera and a DVL using features detected in scenes. Experimental results in a tank and at an offshore wind farm show that the proposed method achieves better robustness and localization accuracy than pure visual SLAM, especially in visually challenging scenarios, and that the extrinsic calibration parameters can be accurately estimated even when initialized with a random guess.
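The camera-DVL fusion idea described in the abstract can be illustrated with a minimal, hypothetical sketch (not the authors' implementation): a 1-D pose graph in which per-step DVL displacement factors are combined with intermittent visual position factors and solved as one weighted linear least-squares problem. All quantities here (noise levels, visibility pattern, factor weights) are illustrative assumptions.

```python
import numpy as np

# Hypothetical 1-D graph-SLAM sketch: fuse DVL displacement factors with
# intermittent visual position factors (vision drops out underwater).
rng = np.random.default_rng(0)
T = 20
true = np.cumsum(rng.uniform(0.5, 1.5, T))                 # true positions
# DVL gives noisy per-step displacements (first factor anchors the graph)
dvl = np.diff(np.concatenate([[0.0], true])) + rng.normal(0, 0.05, T)
vis_idx = [0, 5, 10, 19]                                    # vision available only here
vis = true[vis_idx] + rng.normal(0, 0.02, len(vis_idx))     # noisy visual positions

# Stack one residual row per factor into A x = b, scaled by 1/sigma.
rows, b, w = [], [], []
for t in range(T):                          # DVL factors: p_t - p_{t-1} = dvl_t
    r = np.zeros(T)
    r[t] = 1.0
    if t > 0:
        r[t - 1] = -1.0
    rows.append(r); b.append(dvl[t]); w.append(1 / 0.05)
for i, t in enumerate(vis_idx):             # visual factors: p_t = vis_i
    r = np.zeros(T)
    r[t] = 1.0
    rows.append(r); b.append(vis[i]); w.append(1 / 0.02)

A = np.array(rows) * np.array(w)[:, None]
bb = np.array(b) * np.array(w)
est, *_ = np.linalg.lstsq(A, bb, rcond=None)
print("max abs error:", np.abs(est - true).max())
```

Because both factor types are linear here, one least-squares solve recovers the whole trajectory; the paper's system instead optimizes 6-DoF poses nonlinearly and additionally estimates the camera-DVL extrinsics, but the weighted-factor structure is the same.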

Original language: English
Title of host publication: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Publisher: IEEE
Pages: 7647-7652
Number of pages: 6
ISBN (Electronic): 9781665417143
DOIs
Publication status: Published - 16 Dec 2021
Event: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems - Prague, Czech Republic
Duration: 27 Sept 2021 - 1 Oct 2021

Conference

Conference: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems
Abbreviated title: IROS 2021
Country/Territory: Czech Republic
City: Prague
Period: 27/09/21 - 1/10/21

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
