Human-Robot Interface for Unmanned Aerial Vehicle via a Leap Motion

Cited: 0
Authors
Chen M. [1 ]
Liu C. [1 ]
Du G. [1 ]
Zhang P. [1 ]
Affiliations
[1] School of Computer Science and Engineering, South China University of Technology, Guangzhou
Source
Journal of Beijing Institute of Technology (English Edition) | 2019 / Vol. 28 / No. 1
Funding
National Natural Science Foundation of China;
Keywords
Gesture control; Human-robot interface; Leap Motion; Scheme autoadaption; Unmanned aerial vehicle (UAV);
DOI
10.15918/j.jbit1004-0579.18013
Abstract
The unmanned aerial vehicle (UAV) industry is in the ascendant, yet traditional ways of interacting with a UAV are not intuitive enough, and it is difficult for a beginner to control one; natural interaction methods are therefore preferred. This paper presents a novel interactive control method that steers a UAV through the operator's gestures and explores natural interaction methods for UAVs. The proposed system uses the Leap Motion controller as the input device to acquire the position and orientation of the operator's hand. The proposed human-robot interface is found to track the operator's movement with satisfactory accuracy, and its biggest advantage is that the UAV can be controlled with just one hand instead of a joystick. A series of experiments verified the feasibility of the proposed human-robot interface, and the results demonstrate that non-professional operators can easily operate a remote UAV using this system. © 2019 Editorial Department of Journal of Beijing Institute of Technology.
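For intuition only, the sketch below illustrates the kind of one-handed mapping the abstract describes: palm position and orientation readings, such as a Leap Motion-style sensor would provide, are turned into UAV velocity set-points. It is not the paper's actual control scheme or scheme-autoadaption logic; the HandPose and UavCommand types, gains, and dead-zone values are hypothetical.

# Minimal, hypothetical sketch (Python): map one hand's pose to UAV velocity commands.
# In the paper's system the pose would come from the Leap Motion controller; here the
# fields, gains and dead zone are illustrative assumptions, not the authors' values.
from dataclasses import dataclass

@dataclass
class HandPose:
    x: float        # palm position to the right of the sensor, metres
    y: float        # palm height above the sensor, metres
    z: float        # palm position toward the operator, metres
    roll: float     # palm roll angle, radians

@dataclass
class UavCommand:
    vx: float       # forward velocity, m/s
    vy: float       # lateral velocity, m/s
    vz: float       # vertical velocity, m/s
    yaw_rate: float # yaw rate, rad/s

def hand_to_command(hand: HandPose,
                    hover_height: float = 0.25,
                    dead_zone: float = 0.05,
                    gain: float = 2.0) -> UavCommand:
    """Proportional mapping with a dead zone so small tremors keep the UAV hovering."""
    def shape(value: float) -> float:
        return gain * value if abs(value) > dead_zone else 0.0

    return UavCommand(
        vx=shape(-hand.z),                # push the hand forward -> fly forward
        vy=shape(hand.x),                 # move the hand right -> fly right
        vz=shape(hand.y - hover_height),  # raise the hand -> climb
        yaw_rate=shape(hand.roll),        # roll the palm -> yaw
    )

if __name__ == "__main__":
    # Example: hand pushed slightly forward and raised above the hover height.
    print(hand_to_command(HandPose(x=0.0, y=0.35, z=-0.10, roll=0.0)))

The dead zone keeps small hand tremors from translating into drift, which is one reason a gesture interface of this kind can remain usable for non-professional operators.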
Pages: 1-7
Page count: 6
Related Papers
50 records in total
  • [31] Building a multimodal human-robot interface
    Perzanowski, D
    Schultz, AC
    Adams, W
    Marsh, E
    Bugajska, M
    IEEE INTELLIGENT SYSTEMS & THEIR APPLICATIONS, 2001, 16(1): 16-21
  • [32] On tracking of eye for human-robot interface
    Bhuiyan, MA
    Ampornaramveth, V
    Muto, S
    Ueno, H
    INTERNATIONAL JOURNAL OF ROBOTICS & AUTOMATION, 2004, 19(1): 42-54
  • [33] Multimodal Interface for Human-Robot Collaboration
    Rautiainen, Samu
    Pantano, Matteo
    Traganos, Konstantinos
    Ahmadi, Seyedamir
    Saenz, Jose
    Mohammed, Wael M.
    Lastra, Jose L. Martinez
    MACHINES, 2022, 10 (10)
  • [34] Human motion prediction for human-robot collaboration
    Liu, Hongyi
    Wang, Lihui
    JOURNAL OF MANUFACTURING SYSTEMS, 2017, 44: 287-294
  • [35] On tracking of eye for human-robot interface
    Bhuiyan, M.A.
    Ampornaramveth, Vuthichai
    Muto, S.
    Ueno, Haruki
    International Journal of Robotics and Automation, 2004, 19(1): 42-54
  • [36] Evaluation of an enhanced human-robot interface
    Johnson, CA
    Adams, JA
    Kawamura, K
    2003 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS, VOLS 1-5, CONFERENCE PROCEEDINGS, 2003: 900-905
  • [37] Human-robot coordination with rotational motion
    Kim, KI
    Zheng, YF
    1998 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, 1998: 3480-3485
  • [38] Preliminary design of controllers for the lateral motion of an unmanned aerial vehicle
    Kovacs, J.
    Szegedi, P.
    Ovari, G.
    Transport Means 2006, Proceedings, 2006: 328-331
  • [39] Motion Detection Algorithm for Unmanned Aerial Vehicle Nighttime Surveillance
    Xiao, Huaxin
    Liu, Yu
    Wang, Wei
    Zhang, Maojun
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2014, E97D(12): 3248-3251
  • [40] Control of Hybrid Unmanned Aerial Vehicle Motion in Transitional Modes
    S. A. Belokon
    D. S. Derishev
    Yu. N. Zolotukhin
    A. A. Nesterov
    M. N. Filippov
    Optoelectronics, Instrumentation and Data Processing, 2019, 55: 346-355