Understanding and learning of gestures through human-robot interaction

Cited by: 0
Authors
Kuno, Y. [1]
Murashima, T. [1]
Shimada, N. [1]
Shirai, Y. [1]
Affiliation
[1] Osaka Univ, Dept Comp Controlled Mech Syst, Suita, Osaka 5650871, Japan
Keywords
DOI
Not available
Chinese Library Classification
T [Industrial Technology]
Discipline Code
08
Abstract
Humans can communicate with each other by gestures. Even if they do not recognize a gesture at first, they can come to understand each other through interaction. This paper presents a robot system with such a capability. The robot detects its user by recognizing his/her face, and it can accept commands given by gestures. The user may use gestures unknown to the robot. If the robot does not respond to a gesture, the user usually repeats it. The robot detects this repetitive pattern as an intentional gesture by which the user wants to give it some command. It then performs a small action according to its guess of the gesture's meaning and observes the user's reaction. If he/she continues the same gesture pattern, the robot judges that its interpretation is correct and completes the action; it also registers the pattern as a gesture with the guessed meaning. Otherwise, it repeats the procedure, taking another action as a candidate meaning. We have implemented this interactive capability on our intelligent wheelchair; it is convenient to be able to call the wheelchair over or send it away by gestures while not seated in it. Experimental results confirm that the proposed interaction method is useful in real, complex environments, where even registered gestures cannot always be recognized.
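The procedure described in the abstract is essentially an observe-guess-confirm loop. The Python sketch below illustrates that loop under loose assumptions; observe_gesture, detect_repetition, and CANDIDATE_MEANINGS are hypothetical stand-ins for the paper's vision-based gesture detector and its set of wheelchair commands, not the authors' implementation.

import random

# Candidate meanings the robot can try, in order of preference
# (hypothetical command set; the paper's wheelchair commands may differ).
CANDIDATE_MEANINGS = ["come_here", "go_away", "stop"]

# Learned mapping from a gesture pattern to its confirmed meaning.
known_gestures: dict[str, str] = {}


def observe_gesture() -> str:
    """Stub for the vision system: returns a signature of the observed gesture."""
    return random.choice(["wave_pattern", "wave_pattern", "beckon_pattern"])


def detect_repetition(history: list[str], min_repeats: int = 3) -> str | None:
    """Treat a gesture repeated min_repeats times in a row as intentional."""
    if len(history) >= min_repeats and len(set(history[-min_repeats:])) == 1:
        return history[-1]
    return None


def interaction_loop(max_steps: int = 30) -> None:
    history: list[str] = []
    for _ in range(max_steps):
        gesture = observe_gesture()
        history.append(gesture)

        if gesture in known_gestures:
            print(f"Recognized gesture -> executing '{known_gestures[gesture]}'")
            continue

        pattern = detect_repetition(history)
        if pattern is None:
            continue  # not yet confident the gesture is intentional

        # Guess-and-confirm: show a small tentative action for each candidate
        # meaning and watch whether the user keeps repeating the gesture.
        for meaning in CANDIDATE_MEANINGS:
            print(f"Tentatively acting as '{meaning}' ...")
            reaction = observe_gesture()
            if reaction == pattern:
                # User persisted: interpretation accepted; complete and learn.
                print(f"Confirmed -> completing '{meaning}' and registering gesture")
                known_gestures[pattern] = meaning
                break
            # User changed behavior: guess rejected, try the next candidate.
        history.clear()


if __name__ == "__main__":
    interaction_loop()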
Pages: 2133-2138
Number of pages: 6
Related Papers
50 records in total
  • [1] Incremental learning of gestures for human-robot interaction
    Okada, Shogo
    Kobayashi, Yoichi
    Ishibashi, Satoshi
    Nishida, Toyoaki
    AI & SOCIETY, 2010, 25 (02) : 155 - 168
  • [2] Human-Robot Interaction by Understanding Upper Body Gestures
    Xiao, Yang
    Zhang, Zhijun
    Beck, Aryel
    Yuan, Junsong
    Thalmann, Daniel
PRESENCE-VIRTUAL AND AUGMENTED REALITY, 2014, 23 (02) : 133 - 154
  • [3] Pantomimic Gestures for Human-Robot Interaction
    Burke, Michael
    Lasenby, Joan
    IEEE TRANSACTIONS ON ROBOTICS, 2015, 31 (05) : 1225 - 1237
  • [4] Conversational Gestures in Human-Robot Interaction
    Bremner, Paul
    Pipe, Anthony
    Melhuish, Chris
    Fraser, Mike
    Subramanian, Sriram
    2009 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2009), VOLS 1-9, 2009, : 1645 - +
  • [5] Recognizing Unfamiliar Gestures for Human-Robot Interaction Through Zero-Shot Learning
    Thomason, Wil
    Knepper, Ross A.
    2016 INTERNATIONAL SYMPOSIUM ON EXPERIMENTAL ROBOTICS, 2017, 1 : 841 - 852
  • [6] Learning, Generating and Adapting Wave Gestures for Expressive Human-Robot Interaction
    Panteris, Michail
    Manschitz, Simon
    Calinon, Sylvain
    HRI'20: COMPANION OF THE 2020 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2020, : 386 - 388
  • [7] Unsupervised Simultaneous Learning of Gestures, Actions and their Associations for Human-Robot Interaction
    Mohammad, Yasser
    Nishida, Toyoaki
    Okada, Shogo
2009 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, 2009, : 2537 - 2544
  • [8] Human-Robot Interaction Using Pointing Gestures
    Tolgyessy, Michal
    Dekan, Martin
    Hubinsky, Peter
    ISCSIC'18: PROCEEDINGS OF THE 2ND INTERNATIONAL SYMPOSIUM ON COMPUTER SCIENCE AND INTELLIGENT CONTROL, 2018,
  • [9] Integration of Gestures and Speech in Human-Robot Interaction
    Meena, Raveesh
    Jokinen, Kristiina
    Wilcock, Graham
    3RD IEEE INTERNATIONAL CONFERENCE ON COGNITIVE INFOCOMMUNICATIONS (COGINFOCOM 2012), 2012, : 673 - 678
  • [10] Pointing Gestures for Human-Robot Interaction with the Humanoid Robot Digit
    Lorentz, Viktor
    Weiss, Manuel
    Hildebrand, Kristian
    Boblan, Ivo
    2023 32ND IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, RO-MAN, 2023, : 1886 - 1892