Event-Based, 6-DOF Camera Tracking from Photometric Depth Maps

Cited by: 128
Authors
Gallego, Guillermo [1 ,2 ,3 ]
Lund, Jon E. A. [1 ,2 ,3 ]
Mueggler, Elias [1 ,2 ,3 ]
Rebecq, Henri [1 ,2 ,3 ]
Delbruck, Tobi [1 ,2 ,3 ]
Scaramuzza, Davide [1 ,2 ,3 ]
Affiliations
[1] Univ Zurich, Robot & Percept Grp, Dept Informat, CH-8092 Zurich, Switzerland
[2] Univ Zurich, Dept Neuroinformat, CH-8092 Zurich, Switzerland
[3] Swiss Fed Inst Technol, CH-8092 Zurich, Switzerland
Keywords
Event-based vision; pose tracking; dynamic vision sensor; Bayes filter; asynchronous processing; conjugate priors; low latency; high speed; AR/VR; VISUAL ODOMETRY; VISION; PIXEL;
DOI
10.1109/TPAMI.2017.2769655
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. These cameras do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motions or in scenes characterized by high dynamic range. These features, along with a very low power consumption, make event cameras an ideal complement to standard cameras for VR/AR and video game applications. With these applications in mind, this paper tackles the problem of accurate, low-latency tracking of an event camera from an existing photometric depth map (i.e., intensity plus depth information) built via classic dense reconstruction pipelines. Our approach tracks the 6-DOF pose of the event camera upon the arrival of each event, thus virtually eliminating latency. We successfully evaluate the method in both indoor and outdoor scenes and show that, because of the technological advantages of the event camera, our pipeline works in scenes characterized by high-speed motion, which are still inaccessible to standard cameras.
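The abstract describes updating the 6-DOF camera pose upon the arrival of each event, using a prebuilt photometric depth map as the reference. The Python sketch below only illustrates the general shape of such an event-by-event filter correction; it is not the authors' method (the paper uses a Bayes filter with conjugate priors, whereas this sketch uses a generic Kalman-style update), and the names PhotometricMap, predicted_contrast, and update_pose_with_event, as well as the measurement model and Jacobian values, are hypothetical placeholders.

import numpy as np

class PhotometricMap:
    """Hypothetical stand-in for a prebuilt intensity-plus-depth map
    produced by a classic dense reconstruction pipeline."""
    def predicted_contrast(self, pose, pixel):
        # A real model would predict the brightness change observed at
        # `pixel` for a camera at `pose`; constant placeholder here.
        return 0.0

def update_pose_with_event(pose_mean, pose_cov, event, scene_map,
                           meas_noise=0.1):
    """One filter-style correction per incoming event (illustrative only).

    pose_mean : (6,) minimal 6-DOF pose parameterization
    pose_cov  : (6, 6) covariance of the pose estimate
    event     : (x, y, timestamp, polarity) from the event camera
    """
    x, y, t, polarity = event
    # Treat the event polarity as a quantized brightness-change measurement.
    z = 1.0 if polarity > 0 else -1.0
    z_pred = scene_map.predicted_contrast(pose_mean, (x, y))
    innovation = z - z_pred

    # Jacobian of the predicted measurement w.r.t. the 6-DOF pose; a real
    # tracker would derive this analytically (placeholder values here).
    H = np.full((1, 6), 1e-3)

    # Standard Kalman gain, state correction, and covariance update.
    S = (H @ pose_cov @ H.T).item() + meas_noise
    K = (pose_cov @ H.T) / S          # shape (6, 1)
    pose_mean = pose_mean + (K * innovation).ravel()
    pose_cov = (np.eye(6) - K @ H) @ pose_cov
    return pose_mean, pose_cov

# Example: process a small event stream, one pose update per event.
pose, cov = np.zeros(6), np.eye(6)
events = [(120, 80, 0.001, 1), (121, 80, 0.002, -1)]
for ev in events:
    pose, cov = update_pose_with_event(pose, cov, ev, PhotometricMap())

Because each incoming event triggers its own small correction step, the estimate is refreshed at the event rate rather than at a fixed frame rate, which is what the abstract means by "virtually eliminating latency".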
Pages: 2402 - 2412
Number of pages: 11