Enhancing Human-Robot Interaction by Interpreting Uncertain Information in Navigational Commands Based on Experience and Environment

Cited by: 0
Authors:
Muthugala, M. A. Viraj J. [1 ]
Jayasekara, A. G. Buddhika P. [1 ]
Affiliations:
[1] Univ Moratuwa, Dept Elect Engn, Robot & Control Lab, Moratuwa 10400, Sri Lanka
Keywords:
understanding uncertain information; human-robot interaction; human-friendly robot; assistive robots; experience of robots; FUZZY VOICE COMMANDS;
DOI:
Not available
Chinese Library Classification (CLC):
TP [automation technology; computer technology]
Discipline code:
0812
Abstract:
Assistive robots can support the activities of elderly people and thereby improve their standard of living. Because such systems are intended for non-expert users, assistive robots should be able to interact with their human peers in a human-friendly manner. Humans prefer voice instructions that include uncertain information and lexical symbols; hence, the ability to understand uncertain information is essential for developing natural interaction capabilities in robots. This paper proposes a method for understanding uncertain information such as "close", "near", and "far" in navigational user commands, based on the current environment and the experience of the robot. A robot experience model (REM) is introduced to interpret the lexical representations in user commands and to adapt the robot's perception of uncertain information across heterogeneous domestic environments. The user commands are not bound by a strict grammar model, which allows users to operate the robot in a more natural way. The proposed method has been implemented on an assistive robot platform. Experiments were carried out in an artificially created domestic environment, and the results were analyzed to characterize the behavior of the proposed concept.
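To make the idea concrete, the abstract's notion of interpreting terms like "close", "near", and "far" relative to the current environment can be sketched with fuzzy membership functions whose boundaries scale with the room size. This is only an illustrative assumption: the function names, the triangular membership shapes, and the scaling factors below are hypothetical, not the paper's actual REM implementation.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)


def distance_memberships(distance_m, room_extent_m):
    """Map a metric distance to fuzzy degrees for 'close', 'near', and
    'far', scaling the term boundaries by the extent of the current room
    so that the same word adapts to different environments."""
    s = room_extent_m  # larger rooms stretch what counts as "far"
    return {
        "close": triangular(distance_m, -1e-9, 0.0, 0.25 * s),
        "near":  triangular(distance_m, 0.10 * s, 0.35 * s, 0.60 * s),
        "far":   triangular(distance_m, 0.45 * s, 1.00 * s, 2.00 * s),
    }
```

Under this sketch, a distance of 0.5 m in a 4 m room is mostly "close", while the same 0.5 m in a small 1 m alcove would already lean toward "near"; adapting the scale factor from accumulated experience of each room is the role the abstract assigns to the REM.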
Pages: 2915-2921 (7 pages)
Related papers (50 records):
  • [41] A Gesture Based Interface for Human-Robot Interaction
    Stefan Waldherr
    Roseli Romero
    Sebastian Thrun
    Autonomous Robots, 2000, 9 : 151 - 173
  • [43] Object Modeling for Environment Perception through Human-Robot Interaction
    Kim, Soohwan
    Kim, Dong Hwan
    Park, Sung-Kee
    INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS 2010), 2010, : 2328 - 2333
  • [44] Individual Differences in Human-Robot Interaction in a Military Multitasking Environment
    Chen, Jessie Y. C.
    JOURNAL OF COGNITIVE ENGINEERING AND DECISION MAKING, 2011, 5 (01) : 83 - 105
  • [45] Managing changes in the environment of human-robot interaction and welfare services
    Tuisku, Outi
    Parjanen, Satu
    Hyypiae, Mirva
    Pekkarinen, Satu
    INFORMATION TECHNOLOGY & MANAGEMENT, 2024, 25 (01): : 1 - 18
  • [46] Human-Robot Interaction Based on Frankl Psychology for a Partner Robot
    Masuta, Hiroyuki
    Onishi, Tsuyoshi
    Lim, Hun-ok
    2012 PROCEEDINGS OF SICE ANNUAL CONFERENCE (SICE), 2012, : 79 - 84
  • [47] Human-robot cooperative control based on pHRI (Physical Human-Robot Interaction) of exoskeleton robot for a human upper extremity
    Heedon Lee
    Byeongkyu Lee
    Wansoo Kim
    Myeongsoo Gil
    Jungsoo Han
    Changsoo Han
    International Journal of Precision Engineering and Manufacturing, 2012, 13 : 985 - 992
  • [48] How to include User eXperience in the design of Human-Robot Interaction
    Prati, Elisa
    Peruzzini, Margherita
    Pellicciari, Marcello
    Raffaeli, Roberto
    ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING, 2021, 68
  • [50] Speech-based Human-Robot Interaction Robust to Acoustic Reflections in Real Environment
    Gomez, Randy
    Inoue, Koji
    Nakamura, Keisuke
    Mizumoto, Takeshi
    Nakadai, Kazuhiro
    2014 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2014), 2014, : 1367 - 1373