A visual SLAM method based on point-line fusion in weak-matching scene

Cited by: 17
Authors
Fang, Baofu [1 ,2 ]
Zhan, Zhiqiang [1 ,2 ]
Affiliations
[1] Hefei Univ Technol, Minist Educ, Key Lab Knowledge Engn Big Data, 193 Tunxi Rd, Hefei 230009, Anhui, Peoples R China
[2] Hefei Univ Technol, Sch Comp Sci & Informat Engn, 193 Tunxi Rd, Hefei 230009, Anhui, Peoples R China
Keywords
Mobile robot; simultaneous localization and mapping; point-line fusion; reprojection error; descriptor; transform
DOI
10.1177/1729881420904193
Chinese Library Classification (CLC)
TP24 [Robotics]
Discipline classification codes
080202; 1405
Abstract
Visual simultaneous localization and mapping (SLAM) is a well-established research area in robotics. Traditional point-feature-based approaches face many challenges, such as insufficient point features, motion jitter, and low localization accuracy in low-texture scenes, all of which degrade algorithm performance. In this article, we propose an RGB-D SLAM system, named Point-Line Fusion SLAM (PLF-SLAM), to handle these situations. We utilize both points and line segments throughout the pipeline. Specifically, we present a new line segment extraction method that resolves the overlap and branching problems of extracted segments, and we propose a more rigorous screening mechanism for line matching. Instead of minimizing the reprojection error of points alone, we introduce a reprojection error based on both points and lines to obtain a more accurate tracking pose. In addition, we propose a solution for handling jittered frames, which greatly improves the tracking success rate and the availability of the system. We thoroughly evaluate our system on the Technische Universität München (TUM) RGB-D benchmark and compare it with ORB-SLAM2, widely regarded as the current state-of-the-art solution. The experiments show that our system achieves better accuracy and robustness than ORB-SLAM2.
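For context, the joint point-line reprojection error mentioned in the abstract is commonly written as below. This is a hedged sketch of a standard formulation used in point-line SLAM systems; the exact parametrization, weighting, and robust kernel used in PLF-SLAM may differ, and the symbols here are illustrative assumptions rather than the paper's own notation.

\[
E(\xi) = \sum_{i \in \mathcal{P}} \rho\Big( \big\| \mathbf{u}_i - \pi\big(\mathbf{T}(\xi)\,\mathbf{X}_i\big) \big\|^{2}_{\Sigma_i} \Big)
       + \sum_{j \in \mathcal{L}} \rho\Big( d\big(\mathbf{l}_j, \pi(\mathbf{T}(\xi)\,\mathbf{S}_j)\big)^{2}
       + d\big(\mathbf{l}_j, \pi(\mathbf{T}(\xi)\,\mathbf{E}_j)\big)^{2} \Big),
\qquad
d(\mathbf{l}, \mathbf{u}) = \frac{\mathbf{l}^{\top}\tilde{\mathbf{u}}}{\sqrt{l_1^{2} + l_2^{2}}}
\]

Here \(\mathbf{u}_i\) is an observed keypoint, \(\mathbf{X}_i\) its associated 3-D point, \(\pi(\cdot)\) the pinhole projection, \(\mathbf{T}(\xi)\) the camera pose being optimized, \(\mathbf{S}_j\) and \(\mathbf{E}_j\) the endpoints of a 3-D line segment, \(\mathbf{l}_j = (l_1, l_2, l_3)^{\top}\) the coefficients of the matched 2-D line, \(\tilde{\mathbf{u}}\) the homogeneous pixel coordinate, and \(\rho\) a robust (e.g., Huber) kernel. Minimizing \(E(\xi)\) over both point and line residuals yields the tracking pose.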
Pages: 11
Related papers
50 records in total
  • [1] A Point-Line Feature based Visual SLAM Method in Dynamic Indoor Scene
    Wang, Runzhi
    Wang, Yongkang
    Wan, Wenhui
    Di, Kaichang
    PROCEEDINGS OF 5TH IEEE CONFERENCE ON UBIQUITOUS POSITIONING, INDOOR NAVIGATION AND LOCATION-BASED SERVICES (UPINLBS), 2018: 455-460
  • [2] A Review of Visual SLAM Algorithms for Fusion of Point-Line Features
    Qing, Yong
    Yu, Haidong
    PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON COMPUTER ENGINEERING AND NETWORKS, VOL III, CENET 2023, 2024, 1127: 61-67
  • [3] Robot SLAM algorithm based on visual inertia fusion of point-line features
    Wang L.
    Zhu X.
    Ma D.
    Wang H.
    Zhongguo Guanxing Jishu Xuebao/Journal of Chinese Inertial Technology, 2022, 30 (06): 730-737
  • [4] Improved Point-Line Feature Based Visual SLAM Method for Indoor Scenes
    Wang, Runzhi
    Di, Kaichang
    Wan, Wenhui
    Wang, Yongkang
    SENSORS, 2018, 18 (10)
  • [5] Improved Point-Line Feature Based Visual SLAM Method for Complex Environments
    Zhou, Fei
    Zhang, Limin
    Deng, Chaolong
    Fan, Xinyue
    SENSORS, 2021, 21 (13)
  • [6] A Monocular Visual SLAM Algorithm Based on Point-Line Feature
    Wang D.
    Huang L.
    Li Y.
    Jiqiren/Robot, 2019, 41 (03): 392-403
  • [7] Visual SLAM Algorithm Based on Point-Line Features under RTM Framework
    Jia S.
    Ding M.
    Zhang G.
    Jiqiren/Robot, 2019, 41 (03): 384-391
  • [8] Visual-inertial fusion positioning and mapping method based on point-line features
    Feng, Qinghua
    INTERNATIONAL JOURNAL OF COMPUTER APPLICATIONS IN TECHNOLOGY, 2022, 70 (02): 113-119
  • [9] PLI-VINS: Visual-Inertial SLAM Based on Point-Line Feature Fusion in Indoor Environment
    Zhao, Zhangzhen
    Song, Tao
    Xing, Bin
    Lei, Yu
    Wang, Ziqin
    SENSORS, 2022, 22 (14)
  • [10] Mobile robot localization method based on point-line feature visual-inertial SLAM algorithm
    Xu, Jintao
    Fang, Yu
    Gao, Weiwei
    Liu, Xintian
    Shi, Juanjuan
    Yang, Hao
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2024