Human-Robot Interface for Unmanned Aerial Vehicle via a Leap Motion

Cited by: 0
Authors
Chen M. [1 ]
Liu C. [1 ]
Du G. [1 ]
Zhang P. [1 ]
Affiliation
[1] School of Computer Science and Engineering, South China University of Technology, Guangzhou
Source
Journal of Beijing Institute of Technology (English Edition) | 2019, Vol. 28, No. 1
Fund
National Natural Science Foundation of China
Keywords
Gesture control; Human-robot interface; Leap motion; Scheme autoadaption; Unmanned aerial vehicle (UAV)
DOI
10.15918/j.jbit1004-0579.18013
CLC Number
Subject Classification Code
Abstract
The unmanned aerial vehicle (UAV) industry is in the ascendant, yet traditional ways of interacting with a UAV are not intuitive enough. Because it is difficult for a beginner to control a UAV, natural interaction methods are preferred. This paper presents a novel interactive control method for a UAV through the operator's gestures, and explores natural interaction methods for the UAV. The proposed system uses the Leap Motion controller as an input device to acquire the position and orientation of the gesture. It is found that the proposed human-robot interface can track the movement of the operator with satisfactory accuracy. The biggest advantage of the proposed method is its capability to control the UAV with just one hand instead of a joystick. A series of experiments verified the feasibility of the proposed human-robot interface. The results demonstrate that non-professional operators can easily operate a remote UAV using this system. © 2019 Editorial Department of Journal of Beijing Institute of Technology.
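Illustrative note (not part of the published record): the abstract describes mapping single-hand position and orientation data from a Leap Motion sensor to UAV commands. The following is a minimal sketch of one way such a mapping could look, assuming a velocity set-point interface on the UAV side. The pose-acquisition step is stubbed out, and every name, gain, and threshold here is an illustrative assumption, not a value or design reported by the authors.

# Hedged sketch: map a single tracked hand pose to UAV velocity set-points.
# In a real system the HandPose values would come from the Leap Motion SDK;
# here they are supplied manually so the example is self-contained.

from dataclasses import dataclass

@dataclass
class HandPose:
    x: float        # lateral palm position relative to the sensor (mm)
    y: float        # palm height above the sensor (mm)
    z: float        # forward/backward palm position (mm)
    yaw: float      # palm yaw angle (rad)

@dataclass
class UavCommand:
    vx: float       # forward velocity (m/s)
    vy: float       # lateral velocity (m/s)
    vz: float       # vertical velocity (m/s)
    yaw_rate: float # yaw rate (rad/s)

DEAD_ZONE_MM = 20.0      # ignore small hand jitter around the neutral pose
GAIN_MM_TO_MS = 0.01     # 100 mm of hand displacement -> 1 m/s (assumed gain)
MAX_SPEED = 1.5          # clamp commanded speed (m/s)
HOVER_HEIGHT_MM = 200.0  # palm height treated as the "hover" reference

def _axis(displacement_mm: float) -> float:
    """Apply dead zone, scale to m/s, and clamp one displacement axis."""
    if abs(displacement_mm) < DEAD_ZONE_MM:
        return 0.0
    v = displacement_mm * GAIN_MM_TO_MS
    return max(-MAX_SPEED, min(MAX_SPEED, v))

def hand_to_command(pose: HandPose) -> UavCommand:
    """Map palm displacement from the neutral pose to UAV velocity set-points."""
    return UavCommand(
        vx=_axis(-pose.z),                   # push the hand forward -> fly forward
        vy=_axis(pose.x),                    # move the hand sideways -> strafe
        vz=_axis(pose.y - HOVER_HEIGHT_MM),  # raise/lower the hand -> climb/descend
        yaw_rate=0.5 * pose.yaw,             # rotate the palm -> yaw the UAV
    )

if __name__ == "__main__":
    # Example: hand 80 mm forward and 50 mm above the hover reference.
    cmd = hand_to_command(HandPose(x=5.0, y=250.0, z=-80.0, yaw=0.1))
    print(cmd)  # roughly UavCommand(vx=0.8, vy=0.0, vz=0.5, yaw_rate=0.05)

The dead zone and clamping reflect a common design choice for gesture teleoperation, keeping sensor jitter from producing spurious motion and bounding the commanded speed; the actual scheme-adaptive mapping used in the paper is not reproduced here.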
Pages: 1-7
Page count: 6
Related Papers
50 records in total
  • [21] Effects of Robot Motion on Human-Robot Collaboration
    Dragan, Anca D.
    Bauman, Shira
    Forlizzi, Jodi
    Srinivasa, Siddhartha S.
    PROCEEDINGS OF THE 2015 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'15), 2015, : 51 - 58
  • [22] Development and validation of a human-machine interface for unmanned aerial vehicle (UAV) control via hand gesture teleoperation
    Bolat, Fevzi Cakmak
    Avci, Mustafa Cem
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 273
  • [23] Command-based voice teleoperation of a mobile robot via a human-robot interface
    Poncela, Alberto
    Gallardo-Estrella, Leticia
    ROBOTICA, 2015, 33 (01) : 1 - 18
  • [24] Context-dependent Human-robot Interaction using Indicating motion via Virtual-City Interface
    Sato-Shimokawara, Eri
    Fukusato, Yusuke
    Nakazato, Jun
    Yamaguchi, Toru
    2008 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS, VOLS 1-5, 2008, : 1924 - 1929
  • [25] Human-robot interface for remote control via IoT communication using deep learning techniques for motion recognition
    Martinelli, Dieisson
    Cerbaro, Jonathan
    Fabro, Joao Alberto
    de Oliveira, Andre Schneider
    Simoes Teixeira, Marco Antonio
    2020 XVIII LATIN AMERICAN ROBOTICS SYMPOSIUM, 2020 XII BRAZILIAN SYMPOSIUM ON ROBOTICS AND 2020 XI WORKSHOP OF ROBOTICS IN EDUCATION (LARS-SBR-WRE 2020), 2020, : 31 - 36
  • [26] A human-robot interface for mobile manipulator
    Chen, Mingxuan
    Liu, Caibing
    Du, Guanglong
    INTELLIGENT SERVICE ROBOTICS, 2018, 11 (03) : 269 - 278
  • [27] Vision System for Human-Robot Interface
    Islam, Md Ezharul
    Begum, Nasima
    Bhuiyan, Md. Al-Amin
    2008 11TH INTERNATIONAL CONFERENCE ON COMPUTER AND INFORMATION TECHNOLOGY: ICCIT 2008, VOLS 1 AND 2, 2008, : 617 - 621
  • [28] EXPRESSION OF EMOTIONS THROUGH BODY MOTION A Novel Interface For Human-Robot Interaction
    Goncalves, Nelson
    Sequeira, Joao
    ICINCO 2009: PROCEEDINGS OF THE 6TH INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, VOL 2: ROBOTICS AND AUTOMATION, 2009, : 465 - 470
  • [29] Real-time arm motion imitation for human-robot tangible interface
    Choi, Yukyung
    Ra, SyungKwon
    Kim, Soowhan
    Park, Sung-Kee
    INTELLIGENT SERVICE ROBOTICS, 2009, 2 (02) : 61 - 69
  • [30] A human-robot interface based on electrooculography
    Chen, YX
    Newman, WS
    2004 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1- 5, PROCEEDINGS, 2004, : 243 - 248