DI-EME: Deep Inertial Ego-Motion Estimation for Autonomous Underwater Vehicle

Cited by: 0
Authors
Li, Ziyuan [1 ,2 ]
Yu, Huapeng [3 ]
Yang, Wentie [1 ,2 ]
Zhang, Yanmin [1 ,2 ]
Li, Ye [4 ]
Xiao, Hanchen [1 ,2 ]
Affiliations
[1] Hubei Key Lab Marine Electromagnet Detect & Control, Wuhan 430064, Hubei, Peoples R China
[2] Wuhan Second Ship Design & Res Inst, Wuhan 430064, Hubei, Peoples R China
[3] Natl Innovat Inst Def Technol, Beijing 100071, Peoples R China
[4] Harbin Engn Univ, Sci & Technol Underwater Vehicle Lab, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Navigation; Estimation; Sensors; Underwater navigation; Inertial navigation; Deep learning; Underwater vehicles; ego-motion estimation; inertial measurement units (IMUs); underwater navigation; NAVIGATION SYSTEM; AUV NAVIGATION; CALIBRATION; ORIENTATION;
DOI
10.1109/JSEN.2024.3386354
CLC Classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Inertial navigation systems (INSs) are a mainstream solution for underwater navigation. Although appealing because they can estimate pose without external information, INSs suffer from compounding position errors due to bias and random noise, and they generally require the assistance of other positioning devices to achieve satisfactory positioning results. To address these problems, this article proposes an ego-motion estimation framework based on deep learning that uses only an inertial measurement unit (IMU) and a magnetic compass. The main idea is to estimate the displacement of the vehicle from the IMU data within a time window and to combine it with magnetic compass headings to reconstruct the vehicle's trajectory. Preintegration is used to process the raw IMU data, which mathematically decouples the traditional inertial algorithm from its dependence on the initial state. A hybrid network of convolutional neural networks (CNNs) and attention layers then estimates the displacement of the vehicle. In addition, the framework uses a backpropagation neural network (BPNN) to fuse the magnetic heading with the IMU measurements and obtain an accurate heading. Compared with other deep learning methods, the proposed method reduces computational complexity and improves position accuracy. Finally, the accuracy of the proposed method is verified in sea trials. The results show that the maximum absolute trajectory error is 12.8% of the traveled distance in severe sea conditions and 6.38% in normal sea conditions.
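For readers unfamiliar with the pipeline outlined in the abstract, the following is a minimal, illustrative Python sketch (not the authors' implementation) of the three steps it names: preintegrating a window of raw IMU samples, regressing the window's displacement with a small CNN-plus-attention network, and chaining per-window displacements with headings to rebuild a trajectory. All names (preintegrate_window, DisplacementNet, reconstruct_trajectory), the network sizes, and the 2-D displacement assumption are hypothetical; the BPNN heading-fusion step is not shown, and the fused heading is taken as a given input.

# Illustrative sketch only, assuming 100 Hz 6-axis IMU windows and a planar trajectory.
import numpy as np
import torch
import torch.nn as nn


def preintegrate_window(gyro, accel, dt):
    """Integrate a window of gyro (rad/s) and accel (m/s^2) samples into
    rotation, velocity, and position increments that do not depend on the
    global initial state (the key property of preintegration)."""
    R = np.eye(3)        # accumulated rotation increment
    dv = np.zeros(3)     # velocity increment in the window's start frame
    dp = np.zeros(3)     # position increment in the window's start frame
    for w, a in zip(gyro, accel):
        dp += dv * dt + 0.5 * R @ a * dt ** 2
        dv += R @ a * dt
        # Rodrigues formula for the incremental rotation over one sample
        theta = w * dt
        angle = np.linalg.norm(theta)
        if angle > 1e-8:
            k = theta / angle
            K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
            dR = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K
        else:
            dR = np.eye(3)
        R = R @ dR
    return R, dv, dp


class DisplacementNet(nn.Module):
    """Hypothetical CNN + self-attention regressor mapping a window of
    6-axis IMU samples to a horizontal displacement (dx, dy)."""

    def __init__(self, channels=6, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, x):  # x: (batch, time, 6)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, hidden)
        h, _ = self.attn(h, h, h)                         # self-attention over time
        return self.head(h.mean(dim=1))                   # (batch, 2)


def reconstruct_trajectory(displacements, headings):
    """Rotate each per-window body-frame displacement into the world frame
    using the (fused) heading and accumulate the positions."""
    pos = np.zeros(2)
    track = [pos.copy()]
    for (dx, dy), psi in zip(displacements, headings):
        c, s = np.cos(psi), np.sin(psi)
        pos = pos + np.array([c * dx - s * dy, s * dx + c * dy])
        track.append(pos.copy())
    return np.array(track)

In a full system along the lines of the paper, the headings passed to reconstruct_trajectory would come from the BPNN fusion of the magnetic compass and IMU measurements rather than directly from the compass.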
Pages: 18511-18519
Number of pages: 9