A Vision-aided Inertial Navigation System for Agile High-speed Flight in Unmapped Environments

Times Cited: 0
Authors
Steiner, Ted J. [1]
Truax, Robert D. [1]
Frey, Kristoffer [2]
Affiliations
[1] Draper, 555 Technol Sq, Cambridge, MA 02139 USA
[2] MIT, Dept Aerosp & Aeronaut Engn, 77 Massachusetts Ave, Cambridge, MA 02139 USA
Keywords
SIMULTANEOUS LOCALIZATION; VISUAL ODOMETRY; FILTER; SLAM;
DOI
Not available
Chinese Library Classification (CLC)
V [Aeronautics, Astronautics]
Discipline Classification Codes
08; 0825
Abstract
Small, lightweight flight vehicles, such as consumer-grade quadrotors, are becoming increasingly common. These vehicles' on-board state estimators typically rely upon frequent and accurate updates from external systems such as the Global Positioning System (GPS) to provide the state estimates required for stable flight. However, in many cases GPS signals may be unavailable or unreliable, and loss of GPS can cause these vehicles to go unstable or crash, potentially putting operators, bystanders, and property in danger. Reliance on GPS thus severely limits the robustness and operational capabilities of lightweight flight vehicles. This paper introduces the Smoothing And Mapping With Inertial State Estimation (SAMWISE) navigation system. SAMWISE is a vision-aided inertial navigation system capable of providing high-rate, low-latency state estimates to enable high-dynamic flight through obstacle-laden, unmapped indoor and outdoor environments. SAMWISE offers a flexible framework for inertial navigation with nonlinear measurements, such as those produced by visual feature trackers, by utilizing an incremental smoother to efficiently optimize a set of nonlinear measurement constraints, estimating the vehicle trajectory in a sliding window in real time with a slight processing delay. To overcome this delay and consistently produce state estimates at the high rates necessary for agile flight, we propose a novel formulation in which the smoother runs in a background thread while a low-latency inertial strapdown propagator outputs position, attitude, and velocity estimates at high rate. We additionally propose a novel measurement-buffering approach to seamlessly handle delayed measurements, measurements produced at inconsistent rates, and sensor data requiring significant processing time, such as camera imagery. We present experimental results of high-speed flight with a fully autonomous quadrotor using SAMWISE for closed-loop state estimation, drawn from flight demonstrations during the DARPA Fast Lightweight Autonomy (FLA) program in April and November of 2016. SAMWISE achieved less than 1% position error and up to 5.5 m/s (12 mph) flight in a simulated indoor warehouse environment using a scanning lidar, inertial measurement unit, and laser altimeter during the first FLA milestone event in April 2016. In November 2016, SAMWISE achieved approximately 3% error and up to 20 m/s (45 mph) flight in an open outdoor environment with large obstacles during the second FLA milestone event. The results of these flight tests demonstrate that our navigation system works robustly at high speed across multiple distinct environments.
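The architectural idea described in the abstract, an incremental smoother running in a background thread while a lightweight strapdown propagator integrates IMU data at sensor rate, with delayed or out-of-order measurements held in a buffer until the smoother can consume them, can be illustrated with a short sketch. The following Python code is a minimal, hypothetical illustration only: the class names (MeasurementBuffer, BackgroundSmoother, StrapdownPropagator) and the trivial integration logic are assumptions for exposition and do not reproduce the authors' SAMWISE implementation or its factor-graph optimization.

```python
# Hypothetical sketch of the "background smoother + high-rate strapdown
# propagator" pattern described in the abstract. Not the SAMWISE API.
import threading
import time
from dataclasses import dataclass, field


@dataclass
class State:
    t: float = 0.0
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    velocity: list = field(default_factory=lambda: [0.0, 0.0, 0.0])


class MeasurementBuffer:
    """Time-ordered buffer that tolerates delayed / out-of-order arrivals."""
    def __init__(self):
        self._lock = threading.Lock()
        self._items = []  # list of (timestamp, measurement)

    def push(self, t, meas):
        with self._lock:
            self._items.append((t, meas))
            self._items.sort(key=lambda x: x[0])  # re-order late arrivals

    def pop_up_to(self, t):
        with self._lock:
            ready = [m for m in self._items if m[0] <= t]
            self._items = [m for m in self._items if m[0] > t]
            return ready


class BackgroundSmoother(threading.Thread):
    """Stand-in for the incremental smoother: periodically consumes buffered
    measurements and publishes a (delayed) corrected anchor state."""
    def __init__(self, buffer, on_update):
        super().__init__(daemon=True)
        self.buffer = buffer
        self.on_update = on_update

    def run(self):
        while True:
            time.sleep(0.1)  # smoother iterations are slow relative to the IMU
            batch = self.buffer.pop_up_to(time.time() - 0.05)  # allow latency
            if batch:
                # A real system would optimize a factor graph here; this toy
                # simply treats the last measurement as the corrected state.
                t, meas = batch[-1]
                self.on_update(State(t=t, position=list(meas)))


class StrapdownPropagator:
    """High-rate propagator: integrates IMU samples forward from the most
    recent smoother anchor so output latency stays at one IMU period."""
    def __init__(self):
        self._lock = threading.Lock()
        self.anchor = State(t=time.time())

    def reset_to(self, anchor):
        with self._lock:
            self.anchor = anchor

    def propagate(self, accel, dt):
        with self._lock:
            s = self.anchor
            vel = [v + a * dt for v, a in zip(s.velocity, accel)]
            pos = [p + v * dt for p, v in zip(s.position, vel)]
            self.anchor = State(t=s.t + dt, position=pos, velocity=vel)
            return self.anchor


if __name__ == "__main__":
    buf = MeasurementBuffer()
    prop = StrapdownPropagator()
    smoother = BackgroundSmoother(buf, prop.reset_to)
    smoother.start()
    est = prop.anchor
    for i in range(50):                       # simulate a 100 Hz IMU stream
        est = prop.propagate(accel=[0.1, 0.0, 0.0], dt=0.01)
        if i % 10 == 0:
            buf.push(time.time(), est.position)  # occasional "vision" fix
        time.sleep(0.01)
    print("latest high-rate estimate:", est.position)
```

A full implementation would also re-propagate the IMU samples recorded since the smoother's anchor time after each correction, rather than simply replacing the current state as this sketch does; the sketch is only meant to show how the two rates and the measurement buffer interact.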
Pages: 10