Robust Underwater Visual SLAM Fusing Acoustic Sensing

Cited by: 34
Authors
Vargas, Elizabeth [1 ]
Scona, Raluca [1 ]
Willners, Jonatan Scharff [1 ]
Luczynski, Tomasz [1 ]
Cao, Yu [2 ]
Wang, Sen [1 ]
Petillot, Yvan R. [1 ]
Affiliations
[1] Heriot Watt Univ, Sch Engn & Phys Sci, Edinburgh, Midlothian, Scotland
[2] Univ Edinburgh, Sch Engn, Edinburgh, Midlothian, Scotland
Keywords
DOI
10.1109/ICRA48506.2021.9561537
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812
Abstract
In this paper, we propose an approach for robust visual Simultaneous Localisation and Mapping (SLAM) in underwater environments leveraging acoustic, inertial and altimeter/depth sensors. Underwater visual SLAM is challenging due to factors including poor visibility caused by suspended particles in water, a lack of light and insufficient texture in the scene. Because of this, many state-of-the-art approaches rely on acoustic sensing instead of vision for underwater navigation. Building on the sparse visual SLAM system ORB-SLAM2, this paper proposes to improve the robustness of camera pose estimation in underwater environments by leveraging acoustic odometry, which derives a drifting estimate of the 6-DoF robot pose from the fusion of a Doppler Velocity Log (DVL), a gyroscope and an altimeter or depth sensor. Acoustic odometry estimates are used as motion priors, and we formulate pose residuals that are integrated within the camera pose tracking, local and global bundle adjustment procedures of ORB-SLAM2. The original design of ORB-SLAM2 supports a single map and it enters relocalisation when tracking is lost. This is a significant problem for scenarios where a robot performs a continuous scanning motion without returning to a previously visited location. One of our main contributions is to enable the system to create a new map whenever it encounters a new scene where visual odometry can work. This new map is connected with its predecessor in a common graph using estimates from the proposed acoustic odometry. Experimental results on two underwater vehicles demonstrate the increased robustness of our approach compared to baseline ORB-SLAM2 in controlled, uncontrolled and field environments.
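Note on the pose residuals mentioned in the abstract: this record does not reproduce the authors' exact formulation, but a relative-pose prior of the following standard form is one common way acoustic-odometry constraints are folded into camera tracking and bundle adjustment. This is a sketch under assumed notation only; the symbols T_i, \hat{T}_{ij} and \Sigma_{ij} below are not taken from the paper.

% Sketch only, not the authors' exact cost function.
% Keyframe poses T_i, T_j in SE(3); acoustic-odometry relative motion
% estimate \hat{T}_{ij}; assumed measurement covariance \Sigma_{ij}.
\[
  r_{ij} = \operatorname{Log}\!\left( \hat{T}_{ij}^{-1}\, T_i^{-1} T_j \right) \in \mathbb{R}^{6},
  \qquad
  E_{\mathrm{odo}} = \sum_{(i,j)} r_{ij}^{\top}\, \Sigma_{ij}^{-1}\, r_{ij}.
\]

Under this reading, E_odo would be minimised jointly with the usual visual reprojection error during tracking and local/global bundle adjustment, and the same kind of relative-pose term could link the keyframes of a newly created map to the last keyframe of its predecessor in the common graph.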
Pages: 2140-2146
Page count: 7
Related papers (50 in total)
  • [41] A Robust Visual SLAM System in Dynamic Environment
    Ma, Huajun
    Qin, Yijun
    Duan, Shukai
    Wang, Lidan
    ADVANCES IN NEURAL NETWORKS-ISNN 2024, 2024, 14827 : 248 - 257
  • [42] Robust Large Scale Monocular Visual SLAM
    Bourmaud, Guillaume
    Megret, Remi
    2015 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2015, : 1638 - 1647
  • [43] Underwater Acoustic Localization using pose-graph SLAM
    Real, Marta
    Vial, Pau
    Palomeras, Narcis
    Carreras, Marc
    OCEANS 2023 - LIMERICK, 2023,
  • [44] Fusing acoustic and optical sensing for needle tracking with ultrasound
    Cheng, Alexis
    Zhang, Bofeng
    Oh, Phillip
    Boctor, Emad M.
    MEDICAL IMAGING 2018: IMAGE-GUIDED PROCEDURES, ROBOTIC INTERVENTIONS, AND MODELING, 2018, 10576
  • [45] Navigation of an autonomous underwater vehicle (AUV) using robust SLAM
    West, Michael E.
    Syrmos, Vassilis L.
    PROCEEDINGS OF THE 2006 IEEE INTERNATIONAL CONFERENCE ON CONTROL APPLICATIONS, VOLS 1-4, 2006, : 1125 - 1130
  • [46] An Adaptive Visual Dynamic-SLAM Method Based on Fusing the Semantic Information
    Jiao, Jichao
    Wang, Chenxu
    Li, Ning
    Deng, Zhongliang
    Xu, Wei
    IEEE SENSORS JOURNAL, 2022, 22 (18) : 17414 - 17420
  • [47] An Unsupervised Neural Network for Loop Detection in Underwater Visual SLAM
    Burguera, Antoni
    Bonin-Font, Francisco
    JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2020, 100 (3-4) : 1157 - 1177
  • [48] Robust Visual Odometry in Underwater Environment
    Zhang, Jun
    Ila, Viorela
    Kneip, Laurent
    2018 OCEANS - MTS/IEEE KOBE TECHNO-OCEANS (OTO), 2018,
  • [49] An Unsupervised Neural Network for Loop Detection in Underwater Visual SLAM
    Antoni Burguera
    Francisco Bonin-Font
    Journal of Intelligent & Robotic Systems, 2020, 100 : 1157 - 1177
  • [50] Visual SLAM with Keyframe Selection for Underwater Structure Inspection using an Autonomous Underwater Vehicle
    Hong, Seonghun
    Kim, Jinwhan
    2016 13TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2016, : 558 - 562