A Neuromorphic Vision-Based Measurement for Robust Relative Localization in Future Space Exploration Missions

Citations: 8
Authors
Salah, Mohammed [1 ]
Chehadah, Mohammed [1 ]
Humais, Muhammad [1 ]
Wahbah, Mohammed [1 ]
Ayyad, Abdulla [2 ]
Azzam, Rana [1 ]
Seneviratne, Lakmal [1 ]
Zweiri, Yahya [3 ,4 ]
Affiliations
[1] Khalifa Univ, Ctr Autonomous Robot Syst, Abu Dhabi, U Arab Emirates
[2] Khalifa Univ Sci & Technol, Adv Res & Innovat Ctr ARIC, Abu Dhabi, U Arab Emirates
[3] Khalifa Univ, Ctr Autonomous Robot Syst, Abu Dhabi, U Arab Emirates
[4] Khalifa Univ, Dept Aerosp Engn, Abu Dhabi, U Arab Emirates
Keywords
Flickering landmarks; Gaussian mixture models (GMMs); landmark tracking Kalman filter (LTKF); neuromorphic vision-based measurements (NVBMs); space robotics; translation decoupled Kalman filter (TDKF);
DOI
10.1109/TIM.2022.3217513
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Space exploration has witnessed revolutionary changes with the landing of the Perseverance rover on the Martian surface and the first powered flight beyond Earth by the Mars helicopter, Ingenuity. During their mission on Mars, Perseverance and Ingenuity collaboratively explore the Martian surface, with Ingenuity scouting terrain information to support the rover's safe traversal. Determining the relative pose between the two platforms is therefore of paramount importance to the success of the mission. Driven by this necessity, this work proposes a robust relative localization system based on the fusion of neuromorphic vision-based measurements (NVBMs) and inertial measurements. The emergence of neuromorphic vision triggered a paradigm shift in the computer vision community owing to its unique working principle: asynchronous events are triggered by variations of light intensity in the scene. This implies that no observations can be acquired in static scenes, where the illumination does not change. To circumvent this limitation, high-frequency active landmarks are inserted in the scene to guarantee consistent event firing, and these landmarks are adopted as salient features to facilitate relative localization. A novel event-based landmark identification algorithm using Gaussian mixture models (GMMs) is developed to match landmark correspondences, forming our NVBMs. The NVBMs are fused with inertial measurements in the proposed state estimators: a landmark tracking Kalman filter (LTKF) for landmark tracking and a translation-decoupled Kalman filter (TDKF) for relative localization. The proposed system was tested in a variety of experiments and outperformed state-of-the-art (SOTA) approaches in accuracy and range.
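The measurement-fusion idea behind the filters described above can be illustrated with a minimal predict/update cycle. The sketch below is a generic scalar Kalman step with illustrative noise values, not the paper's actual LTKF/TDKF formulation; the function name and all parameters are assumptions for illustration only.

```python
# Minimal 1-D Kalman filter sketch: fusing noisy position measurements
# (analogous to vision-based measurements such as NVBMs) with a prior
# state estimate. All names and noise variances are illustrative
# assumptions, not taken from the paper.

def kalman_step(x, p, z, q=0.01, r=0.1):
    """One predict/update cycle of a scalar Kalman filter.

    x, p : prior state estimate and its variance
    z    : new position measurement (e.g., from landmark tracking)
    q, r : process and measurement noise variances (assumed values)
    """
    # Predict: static motion model, uncertainty grows by process noise q.
    p_pred = p + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p_pred / (p_pred + r)      # gain in [0, 1]
    x_new = x + k * (z - x)        # corrected estimate
    p_new = (1.0 - k) * p_pred     # reduced variance after the update
    return x_new, p_new

# Usage: noisy measurements around a true position of 5.0;
# the estimate converges while its variance shrinks.
x, p = 0.0, 1.0
for z in [5.2, 4.9, 5.1, 5.0, 4.8]:
    x, p = kalman_step(x, p, z)
```

In the paper's setting, two such estimators run on richer state vectors: one tracks the landmarks themselves, and the other recovers the relative translation from the matched landmark measurements.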
Pages: 1-12 (12 pages)