Probabilistic Plane Extraction and Modeling for Active Visual-Inertial Mapping

Cited by: 3
Authors
Usayiwevu, Mitchell [1 ]
Sukkar, Fouad [1 ,2 ]
Vidal-Calleja, Teresa [1 ,2 ]
Affiliations
[1] Univ Technol Sydney, UTS Robot Inst, Sydney, NSW 2007, Australia
[2] Australian Cobot Ctr, ITTC Collaborat Robot Adv Mfg, Brisbane, Qld, Australia
Funding
Australian Research Council;
Keywords
ODOMETRY;
DOI
10.1109/ICRA48891.2023.10160792
Chinese Library Classification (CLC) number
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
This paper presents an active visual-inertial mapping framework that uses both point and plane features. The key component of the proposed framework is a novel probabilistic plane extraction method and its associated model for estimation. The approach extracts plane parameters and their uncertainties using a modified version of PlaneRCNN [1]. The extracted probabilistic plane features are fused with point features to increase the robustness of the estimation system in texture-less environments, where algorithms based on points alone struggle. A visual-inertial framework based on an iterated extended Kalman filter (IEKF) is used to demonstrate the approach. The IEKF equations are customized with a measurement extrapolation method, which allows the estimator to systematically handle the delay introduced by the neural network's inference time. The system is embedded within an active mapping framework based on Informative Path Planning, which finds the most informative path for minimizing map uncertainty in visual-inertial systems. Experiments with a stereo/IMU system mounted on a robotic arm show that adding planar features to the map, complementing the point features in the state estimation, improves robustness in texture-less environments.
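The abstract describes fusing uncertain plane measurements into an iterated extended Kalman filter. As a rough illustration of that mechanism only, and not the paper's implementation, the sketch below shows a generic IEKF measurement update in NumPy: the state is a plane in a closest-point parameterization, the measurement is a noisy plane estimate whose covariance stands in for the extraction uncertainty, and the helper names (`iekf_update`, `numerical_jacobian`) as well as the identity measurement model are assumptions made for the example.

```python
# Minimal iterated-EKF (IEKF) measurement-update sketch, NumPy only.
# Illustrative assumptions, not the paper's code: identity measurement
# model, closest-point plane parameterization, numerical Jacobians.
import numpy as np

def numerical_jacobian(h, x, eps=1e-6):
    """Central-difference Jacobian of h evaluated at x."""
    z0 = h(x)
    J = np.zeros((z0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (h(x + dx) - h(x - dx)) / (2.0 * eps)
    return J

def iekf_update(x, P, z, R, h, iters=5):
    """Iterated EKF update: relinearize h about the current iterate x_i."""
    x_i = x.copy()
    for _ in range(iters):
        H = numerical_jacobian(h, x_i)
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        # Standard IEKF iterate: innovation evaluated at x_i, prior mean x.
        x_i = x + K @ (z - h(x_i) - H @ (x - x_i))
    P_new = (np.eye(P.shape[0]) - K @ H) @ P
    return x_i, P_new

# Toy usage: state is a plane in closest-point form pi = n * d (3-vector);
# z is a noisy plane estimate (e.g. from a network such as PlaneRCNN) with
# covariance R representing the extraction uncertainty.
h = lambda x: x                          # identity model for this toy case
x0 = np.array([0.0, 0.0, 2.0])           # prior: normal ~ +z, distance 2 m
P0 = np.diag([0.05, 0.05, 0.10])
z = np.array([0.02, -0.01, 1.95])        # noisy plane observation
R = np.diag([0.02, 0.02, 0.05])
x1, P1 = iekf_update(x0, P0, z, R, h)
print(x1, np.diag(P1))
```

With a linear (identity) model the iteration converges in one step; the iterated form only pays off for the nonlinear plane and pose measurement models the paper actually uses, where relinearizing about the updated estimate reduces linearization error.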
Pages: 10601 - 10607
Page count: 7