VIPose: Real-time Visual-Inertial 6D Object Pose Tracking

Cited by: 3
Authors
Ge, Rundong [1 ]
Loianno, Giuseppe [1 ]
Affiliations
[1] NYU, Tandon Sch Engn, Brooklyn, NY 11201 USA
DOI
10.1109/IROS51168.2021.9636283
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Estimating the 6D pose of objects is beneficial for robotics tasks such as transportation, autonomous navigation, and manipulation, as well as in scenarios beyond robotics like virtual and augmented reality. Compared with single-image pose estimation, pose tracking takes into account temporal information across multiple frames to overcome possible detection inconsistencies and to improve pose estimation efficiency. In this work, we introduce a novel Deep Neural Network (DNN) called VIPose that combines inertial and camera data to address the object pose tracking problem in real-time. The key contribution is the design of a novel DNN architecture which fuses visual and inertial features to predict the objects' relative 6D pose between consecutive image frames. The overall 6D pose is then estimated by consecutively combining relative poses. Our approach shows remarkable pose estimation results for heavily occluded objects, which are well known to be very challenging for existing state-of-the-art solutions. The effectiveness of the proposed approach is validated on a new dataset called VIYCB with RGB images, IMU data, and accurate 6D pose annotations created by employing an automated labeling technique. The approach achieves accuracy comparable to state-of-the-art techniques, with the additional benefit of running in real-time.
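The abstract's tracking scheme accumulates an absolute pose by chaining per-frame relative poses. A minimal sketch of that composition step, using 4x4 homogeneous transforms; the function names and the left-multiplication convention are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pose_to_matrix(R, t):
    """Pack a rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def compose_poses(T0, relative_poses):
    """Chain relative transforms onto an initial pose:
    T_k = T_0 @ dT_1 @ ... @ dT_k (assumed convention)."""
    T = T0.copy()
    for dT in relative_poses:
        T = T @ dT
    return T

# Example: two 90-degree turns about z, each with a 1 m step along x.
dT = pose_to_matrix(rot_z(np.pi / 2), np.array([1.0, 0.0, 0.0]))
T = compose_poses(np.eye(4), [dT, dT])
```

Chaining this way also compounds per-step error, which is why the paper evaluates accuracy over full sequences rather than single frame pairs.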
Pages: 4597 - 4603
Page count: 7
Related Papers
50 total
  • [21] Large-scale, real-time visual-inertial localization revisited
    Lynen, Simon
    Zeisl, Bernhard
    Aiger, Dror
    Bosse, Michael
    Hesch, Joel
    Pollefeys, Marc
    Siegwart, Roland
    Sattler, Torsten
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2020, 39 (09): 1061 - 1084
  • [22] Object-Based Visual-Inertial Tracking for Additive Fabrication
    Sandy T.
    Buchli J.
    IEEE Robotics and Automation Letters, 2018, 3 (03) : 1370 - 1377
  • [23] An Improved Approach to 6D Object Pose Tracking in Fast Motion Scenarios
    Wu, Yanming
    Vandewalle, Patrick
    Slaets, Peter
    Demeester, Eric
    2022 SIXTH IEEE INTERNATIONAL CONFERENCE ON ROBOTIC COMPUTING, IRC, 2022, : 229 - 237
  • [24] On Evaluation of 6D Object Pose Estimation
    Hodan, Tomas
    Matas, Jiri
    Obdrzalek, Stephan
    COMPUTER VISION - ECCV 2016 WORKSHOPS, PT III, 2016, 9915 : 606 - 619
  • [25] 6D Object Pose Tracking with Optical Flow Network for Robotic Manipulation
    Chen, Tao
    Gu, Dongbing
    IFAC PAPERSONLINE, 2023, 56 (02): 8048 - 8053
  • [26] Explaining the Ambiguity of Object Detection and 6D Pose From Visual Data
    Manhardt, Fabian
    Arroyo, Diego Martin
    Rupprecht, Christian
    Busam, Benjamin
    Birdal, Tolga
    Navab, Nassir
    Tombari, Federico
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 6840 - 6849
  • [27] Real-time position and pose tracking method of moving object using visual servo system
    Takio, A
    Kondo, K
    Kobashi, S
    Hata, Y
    2004 47TH MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS, VOL III, CONFERENCE PROCEEDINGS, 2004, : 167 - 170
  • [28] Robust Real-time 6D Active Visual Localization for Humanoid Robots
    Gonzalez-Aguirre, David
    Vollert, Michael
    Asfour, Tamim
    Dillmann, Rudiger
    2014 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2014, : 2785 - 2791
  • [29] EGLT-SLAM: Real-Time Visual-Inertial SLAM Based on Entropy-Guided Line Tracking
    Jia, Xiang
    Ning, Yipeng
    Chai, Dashuai
    Fan, Jinlong
    Yang, Zhen
    Xi, Xiaoming
    Zhu, Feng
    Wang, Weiwei
    IEEE SENSORS JOURNAL, 2024, 24 (20) : 32757 - 32771
  • [30] Real-time location estimation for indoor navigation using a visual-inertial sensor
    Wang, Zhe
    Li, Xisheng
    Zhang, Xiaojuan
    Bai, Yanru
    Zheng, Chengcai
    SENSOR REVIEW, 2020, 40 (04) : 455 - 464