Multimodal Behavior Analysis of Human-Robot Navigational Commands

Cited: 0
Authors
Priyanayana, K. S. [1 ]
Jayasekara, A. G. Buddhika P. [1 ]
Gopura, R. A. R. C. [2 ]
Affiliations
[1] Univ Moratuwa, Dept Elect Engn, Moratuwa, Sri Lanka
[2] Univ Moratuwa, Dept Mech Engn, Moratuwa, Sri Lanka
Keywords
Human-robot interaction; Social robotics; Non-verbal communication; Multimodal interaction
DOI
10.1109/ICCR51572.2020.9344419
Chinese Library Classification (CLC)
TP [Automation technology; Computer technology]
Discipline classification code
0812
Abstract
Human-robot interactions should become more human-like, and human-human communication is inherently multimodal: in everyday communication, humans tend to use several modalities simultaneously to convey a message. These modalities include gestures, speech, gaze, and others; the dominant multimodal combination in human-human interaction is speech paired with hand gestures. Hand gestures are used in diverse ways in these interactions, adding meaning and enhancing understanding of the complete interaction along multiple dimensions. The purpose of this paper is to conduct a comprehensive analysis of the multimodal relationship between speech and hand gestures and of its effect on the true meaning of an interaction. The paper therefore examines the aspects of each modality relevant to multimodal interaction, such as vocal uncertainties; static and dynamic hand gestures; deictic, redundant, and unintentional gestures; their timeline parameters; and hand features. Furthermore, it discusses the effect of each speech-gesture parameter on the understanding of vocal ambiguities. A complete analysis of these aspects was conducted through a detailed human study, and the results are interpreted in terms of the multimodal aspects above. Vocal commands are further analyzed using different vocal categories and different types of uncertainty, while hand gestures are analyzed through timeline parameters and hand-feature analysis. For the timeline analysis, the parameters were chosen based on participant feedback on the effectiveness of each parameter; lag time, speed of the gesture movement, and range of the gesture were considered.
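The three timeline parameters named at the end of the abstract (lag time, gesture speed, gesture range) can be sketched as simple computations over a timestamped hand track. This is an illustrative assumption, not the paper's actual pipeline: the function name, the `(t, x, y)` sample format, and the bounding-box definition of range are all hypothetical.

```python
import math

def timeline_parameters(speech_onset, gesture_track):
    """Sketch of the abstract's timeline parameters (hypothetical format).

    gesture_track is a list of (t, x, y) wrist samples in temporal order;
    speech_onset is the time the vocal command begins.
    """
    gesture_onset = gesture_track[0][0]
    # Lag time: delay between speech onset and gesture onset.
    lag_time = gesture_onset - speech_onset

    # Average gesture speed: traversed path length over gesture duration.
    path = 0.0
    for (_, x0, y0), (_, x1, y1) in zip(gesture_track, gesture_track[1:]):
        path += math.hypot(x1 - x0, y1 - y0)
    duration = gesture_track[-1][0] - gesture_track[0][0]
    speed = path / duration if duration > 0 else 0.0

    # Range: spatial extent of the gesture (bounding-box diagonal).
    xs = [x for _, x, _ in gesture_track]
    ys = [y for _, _, y in gesture_track]
    extent = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    return lag_time, speed, extent

# Toy track: three samples over 0.5 s tracing a straight line.
track = [(0.50, 0.0, 0.0), (0.75, 0.3, 0.4), (1.00, 0.6, 0.8)]
lag, speed, extent = timeline_parameters(0.10, track)
```

On this toy track, the gesture starts 0.4 s after the spoken command, covers a path of length 1.0 in 0.5 s (speed 2.0), and spans a bounding-box diagonal of 1.0.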
Pages: 79-84
Number of pages: 6