Multimodal Behavior Analysis of Human-Robot Navigational Commands

Cited by: 0
Authors:
Priyanayana, K. S. [1]
Jayasekara, A. G. Buddhika P. [1]
Gopura, R. A. R. C. [2]
Affiliations:
[1] Univ Moratuwa, Dept Elect Engn, Moratuwa, Sri Lanka
[2] Univ Moratuwa, Dept Mech Engn, Moratuwa, Sri Lanka
Keywords:
Human robot interaction; Social robotics; Non verbal communication; Multimodal interaction
DOI:
10.1109/ICCR51572.2020.9344419
CLC number:
TP [Automation Technology, Computer Technology]
Subject classification code:
0812
Abstract
Human-robot interaction should become more human-like, and human-human communication is inherently multimodal. In everyday communication, humans tend to use several modalities at once to convey a message. Multimodal interaction can involve many modalities, such as gestures, speech, and gaze. The most common multimodal combination in human-human communication is speech together with hand gestures. Hand gestures are used in diverse ways in these interactions; they add meaning and enhance understanding of the complete interaction along multiple dimensions. The purpose of this paper is to conduct a comprehensive analysis of the multimodal relationship between speech and hand gestures and its effect on the intended meaning of an interaction. The paper therefore examines different aspects of each modality in multimodal interaction, including vocal uncertainties, static and dynamic hand gestures, deictic, redundant, and unintentional gestures, their timeline parameters, and hand features. Furthermore, the paper discusses the effect of each speech-gesture parameter on the resolution of vocal ambiguities. A complete analysis of these aspects was conducted through a detailed human study, and the results are interpreted with respect to the multimodal aspects above. Vocal commands are analyzed using different vocal categories and different types of uncertainty, while hand gestures are analyzed through timeline parameters and hand-feature analysis. For the timeline analysis, the parameters were selected based on participant feedback on the effectiveness of each parameter; lag time, speed of the gesture movement, and range of the gesture were considered.
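The abstract names three timeline parameters (lag time, gesture speed, and gesture range) but does not specify how they are computed. The sketch below shows one plausible way to derive them from annotated speech and gesture timestamps; the GestureTrack structure, its field names, and the units are assumptions made for illustration and are not taken from the paper.

```python
from dataclasses import dataclass
from typing import List, Tuple
import math


@dataclass
class GestureTrack:
    """Hypothetical annotation of one hand gesture: onset/offset times (s)
    and a sequence of (t, x, y, z) wrist positions in metres."""
    onset: float
    offset: float
    trajectory: List[Tuple[float, float, float, float]]


def lag_time(speech_onset: float, gesture: GestureTrack) -> float:
    """Lag between the start of the vocal command and the gesture onset (s).
    Positive values mean the gesture started after the speech."""
    return gesture.onset - speech_onset


def gesture_speed(gesture: GestureTrack) -> float:
    """Mean speed of the gesture movement (m/s) over its trajectory."""
    dist = 0.0
    for (t0, x0, y0, z0), (t1, x1, y1, z1) in zip(gesture.trajectory,
                                                  gesture.trajectory[1:]):
        dist += math.dist((x0, y0, z0), (x1, y1, z1))
    duration = gesture.offset - gesture.onset
    return dist / duration if duration > 0 else 0.0


def gesture_range(gesture: GestureTrack) -> float:
    """Spatial range of the gesture: maximum displacement from the start pose (m)."""
    _, x0, y0, z0 = gesture.trajectory[0]
    return max(math.dist((x0, y0, z0), (x, y, z))
               for _, x, y, z in gesture.trajectory)


# Example: a pointing gesture that begins 0.4 s after the command "go there"
g = GestureTrack(onset=1.2, offset=2.0,
                 trajectory=[(1.2, 0.0, 0.0, 0.0),
                             (1.6, 0.2, 0.1, 0.0),
                             (2.0, 0.4, 0.2, 0.0)])
print(lag_time(speech_onset=0.8, gesture=g), gesture_speed(g), gesture_range(g))
```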
Pages: 79-84
Page count: 6