Visual-Thermal Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments

Cited by: 20
|
Authors
Khattak, Shehryar [1 ]
Papachristos, Christos [1 ]
Alexis, Kostas [1 ]
Affiliations
[1] Univ Nevada, Autonomous Robots Lab, Reno, NV 89557 USA
Keywords
ODOMETRY;
DOI
10.1109/aero.2019.8741787
Chinese Library Classification (CLC)
V [Aeronautics, Astronautics];
Discipline Classification Code
08 ; 0825 ;
Abstract
During the past decade, aerial robots have seen an unprecedented expansion in their utility as they take on more tasks which had typically been reserved for humans. With an ever-widening domain of aerial robotic applications, including many mission-critical tasks such as disaster response operations, search and rescue missions, and infrastructure inspections taking place in GPS-denied environments, the need for reliable autonomous operation of aerial robots has become crucial. To accomplish their tasks, aerial robots operating in GPS-denied areas rely on a multitude of sensors to localize and navigate. Visible-spectrum camera systems correspond to the most commonly used sensing modality due to their low cost and weight, rendering them suitable for small aerial robots in indoor or broadly GPS-denied settings. However, in environments that are visually degraded, such as in conditions of poor illumination, low texture, or the presence of obscurants including fog, smoke, and dust, the reliability of visible-light cameras deteriorates significantly. Nevertheless, maintaining reliable robot navigation in such conditions is essential if the robot is to perform many of the critical applications listed above. In contrast to visible-light cameras, thermal cameras offer visibility in the infrared spectrum and can be used in a complementary manner with visible-spectrum cameras for robot localization and navigation tasks, without paying the significant weight and power penalty typically associated with carrying other sensors such as 3D LiDARs or a RADAR. Exploiting this fact, in this work we present a multi-sensor fusion algorithm for reliable odometry estimation in GPS-denied and degraded visual environments. The proposed method utilizes information from both the visible and thermal spectra for landmark selection and prioritizes feature extraction from informative image regions based on a metric over spatial entropy.
Furthermore, inertial sensing cues are integrated to improve the robustness of the odometry estimation process. The proposed method works in real time, fully on board an aerial robot. To verify our solution, a set of challenging experiments were conducted inside a) an obscurant-filled, machine shop-like industrial environment, as well as b) a dark subterranean mine in the presence of heavy airborne dust.
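The abstract's "metric over spatial entropy" for prioritizing informative image regions can be illustrated with a minimal sketch: divide the image into patches, score each patch by the Shannon entropy of its intensity histogram, and extract features first from the highest-scoring (most textured) patches. Note this is a generic illustration of entropy-based region ranking, not the authors' exact formulation; the function names (`patch_entropy`, `rank_patches`) and parameters (patch size, histogram bin count) are assumptions for the example.

```python
import numpy as np

def patch_entropy(patch, bins=32):
    """Shannon entropy (bits) of the intensity distribution in a patch.

    Flat, textureless patches score near 0; richly textured patches
    score close to log2(bins). Intensities are assumed in [0, 1].
    """
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]                      # drop empty bins (0 * log 0 := 0)
    return float(-(p * np.log2(p)).sum())

def rank_patches(image, patch=32, top_k=4):
    """Tile a grayscale image into non-overlapping patches and return
    the (row, col) corners of the top_k most entropic patches, i.e.
    the regions to prioritize for feature extraction."""
    h, w = image.shape
    scored = []
    for r in range(0, h - patch + 1, patch):
        for c in range(0, w - patch + 1, patch):
            scored.append(((r, c), patch_entropy(image[r:r + patch, c:c + patch])))
    scored.sort(key=lambda s: s[1], reverse=True)
    return [corner for corner, _ in scored[:top_k]]
```

In a visual-thermal pipeline, a ranking like this would be computed independently on the visible and thermal images, so that feature budgets are spent on whichever spectrum currently carries texture (e.g. the thermal image inside a smoke-filled room).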
Pages: 9
Related Papers
50 records in total
  • [1] Vision-Depth Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments
    Khattak, Shehryar
    Papachristos, Christos
    Alexis, Kostas
    ADVANCES IN VISUAL COMPUTING, ISVC 2018, 2018, 11241 : 529 - 540
  • [2] Multi-modal Visual-Thermal Saliency-based Object Detection in Visually-degraded Environments
    Tsiourva, Maria
    Papachristos, Christos
    2020 IEEE AEROSPACE CONFERENCE (AEROCONF 2020), 2020,
  • [3] Fusion of visual odometry and inertial navigation system on a smartphone
    Tomazic, Simon
    Skrjanc, Igor
    COMPUTERS IN INDUSTRY, 2015, 74 : 119 - 134
  • [4] Visual landmarks facilitate rodent spatial navigation in virtual reality environments
    Youngstrom, Isaac A.
    Strowbridge, Ben W.
    LEARNING & MEMORY, 2012, 19 (03) : 84 - 90
  • [5] Motion Context Adaptive Fusion of Inertial and Visual Pedestrian Navigation
    Rantanen, Jesperi
    Makela, Maija
    Ruotsalainen, Laura
    Kirkko-Jaakkola, Martti
    2018 NINTH INTERNATIONAL CONFERENCE ON INDOOR POSITIONING AND INDOOR NAVIGATION (IPIN 2018), 2018,
  • [6] Robust Multisensor Fusion for Reliable Mapping and Navigation in Degraded Visual Conditions
    Torchalla, Moritz
    Schnaubelt, Marius
    Daun, Kevin
    von Stryk, Oskar
    2021 IEEE INTERNATIONAL SYMPOSIUM ON SAFETY, SECURITY, AND RESCUE ROBOTICS (SSRR), 2021, : 110 - 117
  • [7] A Robust and Efficient Visual-Inertial SLAM for Vision-Degraded Environments
    Zhao, Xuhui
    Gao, Zhi
    Wang, Jialiang
    Lin, Zhipeng
    Zhou, Zhiyu
    Huang, Yue
    2024 IEEE 18TH INTERNATIONAL CONFERENCE ON CONTROL & AUTOMATION, ICCA 2024, 2024, : 981 - 987
  • [8] Radar Visual Inertial Odometry and Radar Thermal Inertial Odometry: Robust Navigation even in Challenging Visual Conditions
    Doer, Christopher
    Trommer, Gert F.
    2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021, : 331 - 338
  • [9] Loosely Coupled Kalman Filtering for Fusion of Visual Odometry and Inertial Navigation
    Sirtkaya, Salim
    Seymen, Burak
    Alatan, A. Aydin
    2013 16TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION), 2013, : 219 - 226
  • [10] MOBILE ROBOT NAVIGATION USING MONOCULAR VISUAL-INERTIAL FUSION
    Cai, Jianxian
    Gao, Penggang
    Wu, Yanxiong
    Gao, Zhitao
    MECHATRONIC SYSTEMS AND CONTROL, 2021, 49 (01): : 36 - 40