EVPropNet: Detecting Drones By Finding Propellers For Mid-Air Landing And Following

Cited by: 0
Authors
Sanket, Nitin J. [1 ]
Singh, Chahat Deep [1 ]
Parameshwara, Chethan M. [1 ]
Fermueller, Cornelia [1 ]
de Croon, Guido C. H. E. [2 ]
Aloimonos, Yiannis [1 ]
Affiliations
[1] Univ Maryland, Percept & Robot Grp, Inst Adv Comp Studies, College Pk, MD 20742 USA
[2] Delft Univ Technol, Micro Air Vehicle Lab, Delft, Netherlands
Funding
U.S. National Science Foundation
Keywords
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The rapid rise in the accessibility of unmanned aerial vehicles, or drones, poses a threat to general security and confidentiality. Most commercially available or custom-built drones are multi-rotors comprising multiple propellers. Since these propellers rotate at high speed, they are generally the fastest moving parts in an image and cannot be directly "seen" by a classical camera without severe motion blur. We utilize a class of sensors particularly suited to such scenarios, called event cameras, which offer high temporal resolution, low latency, and high dynamic range. In this paper, we model the geometry of a propeller and use it to generate simulated events, which are used to train a deep neural network called EVPropNet to detect propellers in event-camera data. EVPropNet transfers directly to the real world without any fine-tuning or retraining. We present two applications of our network: (a) tracking and following an unmarked drone and (b) landing on a near-hover drone. We successfully evaluate and demonstrate the proposed approach in many real-world experiments with different propeller shapes and sizes. Our network detects propellers at a rate of 85.1% even when 60% of the propeller is occluded, and can run at up to 35 Hz on a 2 W power budget. To our knowledge, this is the first deep-learning-based solution for detecting propellers (to detect drones). Finally, our applications also show impressive success rates of 92% and 90% for the tracking and landing tasks, respectively.
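The abstract's event-simulation step can be illustrated with a toy sketch. This is not the authors' actual geometric model: the wedge-shaped blades, grid size, and binary on/off intensity below are simplifying assumptions. The idea is that a rotating multi-blade silhouette is sampled over time, and an event is emitted whenever a pixel's brightness flips — positive polarity when a blade arrives, negative when it leaves.

```python
import math

def inside_blade(x, y, cx, cy, theta, n_blades=2, half_angle=0.15, r_max=18.0):
    """True if pixel (x, y) lies on a blade of a propeller centered at
    (cx, cy), with blades rotated by theta radians (hypothetical shape)."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    if r < 1e-6 or r > r_max:
        return False
    sector = 2.0 * math.pi / n_blades          # angular spacing of blades
    m = (math.atan2(dy, dx) - theta) % sector  # angle relative to the nearest blade axis
    return min(m, sector - m) < half_angle

def simulate_events(omega, dt, steps, size=41):
    """Emit (x, y, t, polarity) events as the simulated propeller rotates:
    +1 when a pixel turns bright (blade arrives), -1 when it turns dark."""
    cx = cy = size // 2
    prev = [[inside_blade(x, y, cx, cy, 0.0) for x in range(size)]
            for y in range(size)]
    events = []
    for k in range(1, steps + 1):
        theta = omega * k * dt
        for y in range(size):
            for x in range(size):
                cur = inside_blade(x, y, cx, cy, theta)
                if cur != prev[y][x]:
                    events.append((x, y, k * dt, 1 if cur else -1))
                    prev[y][x] = cur
    return events

# A propeller spinning at ~50 rev/s, sampled every 0.1 ms for 10 steps.
evts = simulate_events(omega=2 * math.pi * 50, dt=1e-4, steps=10)
```

In the paper, simulated events of this kind are used to train EVPropNet; the sketch only shows where events concentrate (along the moving blade edges), which is the signature such a detector would learn.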
Pages: 11
Related Papers (7 records)
  • [1] Light-Weight Wireless Power Transfer for Mid-Air Charging of Drones
    Aldhaher, Samer
    Mitcheson, Paul D.
    Arteaga, Juan M.
    Kkelis, George
    Yates, David C.
    2017 11TH EUROPEAN CONFERENCE ON ANTENNAS AND PROPAGATION (EUCAP), 2017, : 336 - 340
  • [2] Designing Telepresence Drones to Support Synchronous, Mid-air Remote Collaboration: An Exploratory Study
    Sabet, Mehrnaz
    Orand, Mania
    McDonald, David W.
    CHI '21: PROCEEDINGS OF THE 2021 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2021,
  • [3] Detecting Mid-Air Gestures for Digit Writing With Radio Sensors and a CNN
    Leem, Seong Kyu
    Khan, Faheem
    Cho, Sung Ho
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2020, 69 (04) : 1066 - 1081
  • [4] TiPoint: Detecting Fingertip for Mid-Air Interaction on Computational Resource Constrained Smartglasses
    Lee, Lik Hang
    Braud, Tristan
    Bijarbooneh, Farshid Hassani
    Hui, Pan
    ISWC'19: PROCEEDINGS OF THE 2019 ACM INTERNATIONAL SYMPOSIUM ON WEARABLE COMPUTERS, 2019, : 118 - 122
  • [5] Design process and real-time validation of an innovative autonomous mid-air flight and landing system
    De Lellis, E.
    Di Vito, V.
    Garbarino, L.
    Lai, C.
    Corraro, F.
    World Academy of Science, Engineering and Technology, 2011, 55 : 174 - 184
  • [6] Orienting in Mid-air through Configuration Changes to Achieve a Rolling Landing for Reducing Impact after a Fall
    Bingham, Jeffrey T.
    Lee, Jeongseok
    Haksar, Ravi N.
    Ueda, Jun
    Liu, C. Karen
    2014 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2014), 2014, : 3610 - 3617
  • [7] Mid-air Tactile Display Using Indirect Laser Radiation for Contour-Following Stimulation and Assessment of Its Spatial Acuity
    Cha, Hojun
    Lee, Hojin
    Park, Junsuk
    Kim, Hyung-Sik
    Chung, Soon-Cheol
    Choi, Seungmoon
    2017 IEEE WORLD HAPTICS CONFERENCE (WHC), 2017, : 136 - 141