VIPose: Real-time Visual-Inertial 6D Object Pose Tracking

Cited by: 3
Authors
Ge, Rundong [1]
Loianno, Giuseppe [1]
Affiliations
[1] NYU, Tandon Sch Engn, Brooklyn, NY 11201 USA
Keywords
DOI: 10.1109/IROS51168.2021.9636283
Chinese Library Classification: TP [Automation Technology, Computer Technology];
Discipline Code: 0812;
Abstract
Estimating the 6D pose of objects is beneficial for robotics tasks such as transportation, autonomous navigation, and manipulation, as well as for scenarios beyond robotics like virtual and augmented reality. Compared with single-image pose estimation, pose tracking exploits temporal information across multiple frames to overcome possible detection inconsistencies and to improve pose estimation efficiency. In this work, we introduce a novel Deep Neural Network (DNN) called VIPose, which combines inertial and camera data to address the object pose tracking problem in real time. The key contribution is the design of a novel DNN architecture that fuses visual and inertial features to predict the objects' relative 6D pose between consecutive image frames. The overall 6D pose is then estimated by consecutively composing these relative poses. Our approach shows remarkable pose estimation results for heavily occluded objects, which are well known to be very challenging for existing state-of-the-art solutions. The effectiveness of the proposed approach is validated on a new dataset called VIYCB, with RGB images, IMU data, and accurate 6D pose annotations created by an automated labeling technique. The approach achieves accuracy comparable to state-of-the-art techniques, with the additional benefit of running in real time.
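The tracking-by-composition scheme described in the abstract can be summarized as follows: given an initial object pose and the network's prediction of the relative pose between each pair of consecutive frames, the absolute pose at any frame is obtained by chaining the relative transforms onto the initial one. The sketch below is a minimal illustration, assuming poses are represented as 4x4 homogeneous transforms; the predicted_relative_pose placeholder is hypothetical and only stands in for the network's per-frame output, it is not part of the VIPose implementation.

import numpy as np

def pose_matrix(R, t):
    # Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def predicted_relative_pose(angle=0.01, shift=0.005):
    # Hypothetical stand-in for the network's relative-pose prediction between two
    # consecutive frames: a small rotation about z plus a small translation along x.
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return pose_matrix(R, np.array([shift, 0.0, 0.0]))

# Start from a known initial object pose and chain the per-frame relative poses
# to obtain the absolute pose at the latest frame.
T_world_obj = np.eye(4)
for _ in range(100):
    T_world_obj = T_world_obj @ predicted_relative_pose()

print(T_world_obj)

In practice the placeholder would be replaced by the fused visual-inertial prediction for each frame pair; the composition step itself is the same matrix chaining shown above.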
Pages: 4597-4603
Number of pages: 7
Related Papers (50 in total)
  • [31] A Real-Time Visual-Inertial Monocular Odometry by Fusing Point and Line Features
    Li, Chengwei
    Yan, Liping
    Xia, Yuanqing
    2021 PROCEEDINGS OF THE 40TH CHINESE CONTROL CONFERENCE (CCC), 2021, : 4085 - 4090
  • [32] Using the Marginalised Particle Filter for Real-Time Visual-Inertial Sensor Fusion
    Bleser, Gabriele
    Stricker, Didier
    7TH IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY 2008, PROCEEDINGS, 2008, : 3 - 12
  • [33] DynaVIO: Real-Time Visual-Inertial Odometry with Instance Segmentation in Dynamic Environments
    Zheng, Feixiang
    Lin, Wanbiao
    Sun, Lei
    2024 4TH INTERNATIONAL CONFERENCE ON COMPUTER, CONTROL AND ROBOTICS, ICCCR 2024, 2024, : 21 - 25
  • [34] Visual-inertial navigation, mapping and localization: A scalable real-time causal approach
    Jones, Eagle S.
    Soatto, Stefano
INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2011, 30 (04): 407 - 430
  • [35] A Visual-Inertial Servoing Method for Tracking Object with Two Landmarks and an Inertial Measurement Unit
    Nguyen, Ho-Quoc-Phuong
    Kang, Hee-Jun
    Suh, Young-Soo
    INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS, 2011, 9 (02) : 317 - 327
  • [36] A real-time visual-inertial mapping and localization method by fusing unstable GPS
    Zhang, Zhongyuan
    Wang, Hesheng
    Chen, Weidong
    2018 13TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION (WCICA), 2018, : 1397 - 1402
  • [37] A Visual-Inertial Navigation System Using AprilTag for Real-Time MAV Applications
    Barbosa, Joao Paulo de Almeida
    Dias, Stiven Schwanz
    dos Santos, Davi Antonio
    PROCEEDINGS OF THE 2018 25TH INTERNATIONAL CONFERENCE ON MECHATRONICS AND MACHINE VISION IN PRACTICE (M2VIP), 2018, : 191 - 197
  • [38] A Real-Time Sliding-Window-Based Visual-Inertial Odometry for MAVs
    Xiao, Junhao
    Xiong, Dan
    Yu, Qinghua
    Huang, Kaihong
    Lu, Huimin
    Zeng, Zhiwen
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2020, 16 (06) : 4049 - 4058
  • [39] A visual-inertial servoing method for tracking object with two landmarks and an inertial measurement unit
    Ho-Quoc-Phuong Nguyen
    Hee-Jun Kang
    Young-Soo Suh
    International Journal of Control, Automation and Systems, 2011, 9 : 317 - 327
  • [40] Schmidt-EKF-based Visual-Inertial Moving Object Tracking
    Eckenhoff, Kevin
    Geneva, Patrick
    Merrill, Nathaniel
    Huang, Guoquan
    2020 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2020, : 651 - 657