Vision-Based Autonomous Landing on Unprepared Field With Rugged Surface

Cited by: 1
Authors
Liu, Zhifa [1 ,2 ]
Wang, Chunyuan [1 ,2 ]
Chen, Kejing [1 ,2 ]
Meng, Wei [1 ,2 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China
[2] Guangdong Prov Key Lab Intelligent Decis & Cooper, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Autonomous landing; multiview stereo; neural network (NN); unmanned aerial vehicle (UAV)
DOI
10.1109/JSEN.2022.3194190
CLC Classification
TM [Electrical Technology]; TN [Electronics and Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
Landing on unprepared terrain is a challenging task for fully autonomous micro unmanned aerial vehicles (UAVs). Most existing methods rely on manual control when the terrain of the target area is not known in advance. In this article, we propose an autonomous landing method built on a learning-based multiview stereo (MVS) system. The UAV acquires multiple RGB images of the terrain after cruising, then extracts speeded-up robust features (SURF) and performs structure from motion (SfM) to obtain a sparse feature point cloud. Using the initial depth map generated from this cloud, we further propose a novel 3-D reconstruction algorithm named PatchmatchNet-A, which yields precise and stable depth estimates. In PatchmatchNet-A, we also use a new activation function, the adjustable-arctangent linear unit (ALU), to improve accuracy and robustness. Tests on the DTU dataset show that our algorithm speeds up processing by around 25% while maintaining competitive performance. Flight experiments with a commercial M210V2 quadcopter have also been conducted to verify the effectiveness of the complete landing system.
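The front end described in the abstract (SURF feature extraction followed by structure from motion to obtain a sparse point cloud) can be illustrated with a minimal two-view sketch. This is not the authors' code: the function below is a hypothetical OpenCV example, it assumes a camera intrinsic matrix K calibrated offline, and SURF is patented, so it requires an opencv-contrib build with the non-free modules enabled.

import cv2
import numpy as np

def sparse_cloud_from_pair(img1, img2, K):
    """Hypothetical two-view SfM step: SURF matches -> pose -> sparse cloud."""
    # SURF lives in the non-free contrib module (OPENCV_ENABLE_NONFREE).
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(img1, None)
    kp2, des2 = surf.detectAndCompute(img2, None)

    # Lowe's ratio test on L2-matched SURF descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < 0.7 * n.distance]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # Essential matrix and relative pose: the minimal SfM building block.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    # Keep the pose inliers only, then triangulate them into 3-D points.
    inl = mask.ravel() > 0
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1[inl].T, pts2[inl].T)
    return (pts4d[:3] / pts4d[3]).T  # N x 3 sparse feature point cloud

A full MVS front end would run this over many views with bundle adjustment; per the abstract, the resulting sparse cloud seeds the initial depth map that PatchmatchNet-A then refines.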
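The record names the adjustable-arctangent linear unit (ALU) but does not give its formula, so the PyTorch module below is purely an illustrative guess at one plausible form: identity for non-negative inputs and a learnable-slope arctangent for negative inputs, in the spirit of PReLU. The actual ALU defined in the paper may differ.

import torch
import torch.nn as nn

class ALU(nn.Module):
    """Assumed form of an adjustable-arctangent linear unit (not from the paper)."""

    def __init__(self, init_slope: float = 0.5):
        super().__init__()
        # Learnable slope: the "adjustable" part of the name (assumption).
        self.a = nn.Parameter(torch.tensor(init_slope))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # f(x) = x for x >= 0, a * arctan(x) otherwise (assumed form).
        return torch.where(x >= 0, x, self.a * torch.atan(x))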
Pages: 17914-17923
Number of pages: 10
Related Papers
50 records in total
  • [41] A hierarchical vision-based localization of rotor unmanned aerial vehicles for autonomous landing
    Yuan, Haiwen
    Xiao, Changshi
    Xiu, Supu
    Zhan, Wenqiang
    Ye, Zhenyi
    Zhang, Fan
    Zhou, Chunhui
    Wen, Yuanqiao
    Li, Qiliang
    INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS, 2018, 14 (09)
  • [42] A software platform for vision-based UAV autonomous landing guidance based on markers estimation
    Xu, XiaoBin
    Wang, Zhao
    Deng, YiMin
    SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2019, 62 (10) : 1825 - 1836
  • [43] Vision-based docking using an Autonomous Surface Vehicle
    Dunbabin, Matthew
    Lang, Brenton
    Wood, Brett
    2008 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-9, 2008 : 26+
  • [44] Stereo Vision-Based Navigation for Autonomous Surface Vessels
    Huntsberger, Terry
    Aghazarian, Hrand
    Howard, Andrew
    Trotz, David C.
    JOURNAL OF FIELD ROBOTICS, 2011, 28 (01) : 3 - 18
  • [45] Autonomous Landing on a Moving Platform Using Vision-Based Deep Reinforcement Learning
    Ladosz, Pawel
    Mammadov, Meraj
    Shin, Heejung
    Shin, Woojae
    Oh, Hyondong
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (05) : 4575 - 4582
  • [47] Vision-based Autonomous Landing of UAV on Moving Platform using a New Marker
    Huang, Xiaoyun
    Xu, Qing
    Wang, Jianqiang
    2019 3RD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE APPLICATIONS AND TECHNOLOGIES (AIAAT 2019), 2019, 646
  • [48] Autonomous landing of a quadrotor on a moving platform using vision-based FOFPID control
    Ghasemi, Ali
    Parivash, Farhad
    Ebrahimian, Serajeddin
    ROBOTICA, 2022, 40 (05) : 1431 - 1449
  • [50] Vision-Based Air-to-Air Autonomous Landing of Underactuated VTOL UAVs
    Roggi, Gabriele
    Gozzini, Giovanni
    Invernizzi, Davide
    Lovera, Marco
    IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2024, 29 (03) : 2338 - 2349