TPL-SLAM: Real-Time Monocular Thermal-Inertial SLAM With Point-Line Tracked by Optical Flow

Cited by: 0
Authors
Lai, Luguang [1 ]
Li, Linyang [2 ]
Zhou, Yuxuan [2 ]
Zhang, Letian [1 ]
Zhao, Dongqing [1 ]
Affiliations
[1] Informat Engn Univ, Dept Geospatial Informat, Zhengzhou 450001, Peoples R China
[2] Wuhan Univ, Sch Geodesy & Geomatics, Wuhan 430079, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Simultaneous localization and mapping; Feature extraction; Cameras; Visualization; Optical flow; Real-time systems; Odometry; Noise; Computational efficiency; Lighting; Long-wave infrared (LWIR); monocular thermal-inertial odometry (TIO); point-line combination; simultaneous localization and mapping (SLAM); subterranean space; VISUAL SLAM; ROBUST;
DOI
10.1109/JSEN.2025.3537164
CLC Classification Number
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Code
0808; 0809
Abstract
Visual simultaneous localization and mapping (SLAM) systems perform poorly, or fail outright, under extreme conditions such as low light, smoke, and fog, whereas infrared cameras are far more robust to interference in these challenging scenes. However, the high noise and poor imaging quality of infrared cameras severely limit the performance of infrared SLAM. Considering the imaging characteristics of infrared cameras and the weak-texture nature of subterranean structured scenes, a point-line combined thermal-inertial SLAM system (TPL-SLAM) is proposed. To improve the computational efficiency of point-line combined SLAM, the efficient ELSED algorithm is employed to extract line features, and a 3-degrees-of-freedom (DOF) line-feature optical flow tracking algorithm is proposed to track line features across consecutive frames. The back-end module then optimizes inertial measurement unit (IMU), point, and line feature factors in real time within a sliding window, and performs loop detection jointly with the point and line features on keyframes. Extensive experiments on real-world datasets validate the effectiveness of TPL-SLAM: it outperforms current state-of-the-art monocular visual-inertial systems (VINS), and parallel loop detection with point-line features effectively reduces the risk of false loops. The computational efficiency of the proposed line feature extraction and tracking module exceeds that of PL-VINS and EPLF-VINS and meets the requirements of real-time operation. The data and code for line feature processing are available at https://github.com/Fireflyatcode/TPL_SLAM.
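The 3-DOF line tracking mentioned in the abstract parameterizes each segment's inter-frame motion as a translation (dx, dy) plus a rotation dθ. The sketch below illustrates only this parametrization under one plausible convention (rotation about the segment midpoint); it is not the authors' implementation, and the function name `apply_line_motion` is hypothetical.

```python
import math

def apply_line_motion(p1, p2, dx, dy, dtheta):
    """Move a 2-D line segment by a 3-DOF increment:
    translate both endpoints by (dx, dy), then rotate them by
    dtheta about the translated midpoint.  Endpoints are (x, y)
    tuples; returns the two new endpoints."""
    # Translate both endpoints.
    q1 = (p1[0] + dx, p1[1] + dy)
    q2 = (p2[0] + dx, p2[1] + dy)
    # Rotate about the midpoint, which preserves segment length.
    mx, my = (q1[0] + q2[0]) / 2.0, (q1[1] + q2[1]) / 2.0
    c, s = math.cos(dtheta), math.sin(dtheta)

    def rot(p):
        rx, ry = p[0] - mx, p[1] - my
        return (mx + c * rx - s * ry, my + s * rx + c * ry)

    return rot(q1), rot(q2)

# Example: shift a horizontal segment by (2, 1), then rotate it
# 90 degrees; the segment becomes vertical but keeps its length.
a, b = apply_line_motion((0.0, 0.0), (4.0, 0.0), 2.0, 1.0, math.pi / 2)
```

Because only three scalars are estimated per segment, such a model is cheaper to track with optical flow than re-detecting and re-matching full endpoint pairs every frame.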
Pages: 10015-10029 (15 pages)
Related Papers
50 in total
  • [41] Mobile robot localization method based on point-line feature visual-inertial SLAM algorithm
    Xu, Jintao
    Fang, Yu
    Gao, Weiwei
    Liu, Xintian
    Shi, Juanjuan
    Yang, Hao
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2024,
  • [42] Real-Time Monocular Object-Model Aware Sparse SLAM
    Hosseinzadeh, Mehdi
    Li, Kejie
    Latif, Yasir
    Reid, Ian
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 7123 - 7129
  • [43] Real-Time Visual-Inertial Odometry Based on Point-Line Feature Fusion
    Yang, G.
    Meng, W. D.
    Hou, G. D.
    Feng, N. N.
    GYROSCOPY AND NAVIGATION, 2023, 14 (4) : 339 - 352
  • [44] PL-ISLAM: an Accurate Monocular Visual-Inertial SLAM with Point and Line Features
    Wang, Haobo
    Guan, Lianwu
    Yu, Xilin
    Zhang, Zibin
    PROCEEDINGS OF 2022 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATION (IEEE ICMA 2022), 2022, : 1141 - 1146
  • [45] Orbeez-SLAM: A Real-time Monocular Visual SLAM with ORB Features and NeRF-realized Mapping
    Chung, Chi-Ming
    Tseng, Yang-Che
    Hsu, Ya-Ching
    Shi, Xiang-Qian
    Hua, Yun-Hung
    Yeh, Jia-Fong
    Chen, Wen-Chin
    Chen, Yi-Ting
    Hsu, Winston H.
    2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2023), 2023, : 9400 - 9406
  • [46] RDMO-SLAM: Real-Time Visual SLAM for Dynamic Environments Using Semantic Label Prediction With Optical Flow
    Liu, Yubao
    Miura, Jun
    IEEE ACCESS, 2021, 9 : 106981 - 106997
  • [47] Improving the real-time efficiency of inertial SLAM and understanding its observability
    Kim, J.
    2004 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2004
  • [48] Real-Time Performance Test of a Vision-based Inertial SLAM
    Yun, Sukchang
    Lee, Byoung-Jin
    Lee, Young Jae
    Sung, Sangkyung
    INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS 2010), 2010, : 2423 - 2426
  • [49] VIS-SLAM: A Real-Time Dynamic SLAM Algorithm Based on the Fusion of Visual, Inertial, and Semantic Information
    Wang, Yinglong
    Liu, Xiaoxiong
    Zhao, Minkun
    Xu, Xinlong
    ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION, 2024, 13 (05)
  • [50] Fast Point Cloud Feature Extraction for Real-time SLAM
    Lee, Sheng-Wei
    Hsu, Chih-Ming
    Lee, Ming-Che
    Fu, Yuan-Ting
    Atas, Fetullah
    Tsai, Augustine
    2019 INTERNATIONAL AUTOMATIC CONTROL CONFERENCE (CACS), 2019,