A Gesture Based Interface for Human-Robot Interaction

Cited by: 0
Authors
Stefan Waldherr
Roseli Romero
Sebastian Thrun
Affiliations
[1] Carnegie Mellon University,Computer Science Department
[2] Universidade de São Paulo,Instituto de Ciências Matemáticas e de Computação
[3] Carnegie Mellon University,Computer Science Department
Source
Autonomous Robots | 2000, Vol. 9
Keywords
gestures; human-robot interaction; mobile robot navigation; service robots; visual template matching; hidden Markov models; neural networks
DOI
Not available
Abstract
Service robotics is currently a highly active research area in robotics, with enormous societal potential. Since service robots directly interact with people, finding "natural" and easy-to-use user interfaces is of fundamental importance. While past work has predominantly focused on issues such as navigation and manipulation, relatively few robotic systems are equipped with flexible user interfaces that permit controlling the robot by "natural" means. This paper describes a gesture interface for the control of a mobile robot equipped with a manipulator. The interface uses a camera to track a person and recognize gestures involving arm motion. A fast, adaptive tracking algorithm enables the robot to track and follow a person reliably through office environments with changing lighting conditions. Two alternative methods for gesture recognition are compared: a template-based approach and a neural network approach. Both are combined with the Viterbi algorithm for the recognition of gestures defined through arm motion (in addition to static arm poses). Results are reported in the context of an interactive clean-up task, where a person guides the robot to specific locations that need to be cleaned and instructs the robot to pick up trash.
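The abstract states that both recognizers (template-based and neural-network) are combined with the Viterbi algorithm so that motion gestures, not just static poses, can be decoded from a sequence of per-frame pose labels. As a rough illustration of that decoding step only (this is a generic discrete-HMM Viterbi decoder; the state names, observation encoding, and all model parameters below are hypothetical, not taken from the paper), the idea might be sketched as:

```python
import numpy as np

def viterbi(obs, start_p, trans_p, emit_p):
    """Return the most likely hidden-state sequence for `obs`.

    obs     : sequence of observation indices (e.g. quantized arm poses)
    start_p : (S,) initial state probabilities
    trans_p : (S, S) state transition probabilities
    emit_p  : (S, O) observation probabilities per state
    """
    S = len(start_p)
    T = len(obs)
    # Work in log space to avoid underflow on long gesture sequences.
    log_delta = np.log(start_p) + np.log(emit_p[:, obs[0]])
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        scores = log_delta[:, None] + np.log(trans_p)   # (S, S): from-state x to-state
        back[t] = np.argmax(scores, axis=0)             # best predecessor per state
        log_delta = scores[back[t], np.arange(S)] + np.log(emit_p[:, obs[t]])
    # Backtrack from the best final state.
    path = [int(np.argmax(log_delta))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Toy two-state model: state 0 = "arm down", state 1 = "arm raised";
# observations 0/1 stand for noisy pose labels from a vision front end.
start = np.array([0.9, 0.1])
trans = np.array([[0.8, 0.2],
                  [0.3, 0.7]])
emit = np.array([[0.9, 0.1],
                 [0.2, 0.8]])
print(viterbi([0, 0, 1, 1, 1], start, trans, emit))  # → [0, 0, 1, 1, 1]
```

Decoding the pose-label stream this way smooths over single-frame recognition errors: an isolated mislabeled frame is overridden when the transition model makes an abrupt state flip less likely than a consistent arm trajectory.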
Pages: 151–173
Number of pages: 22