A multimodal language to communicate with life-supporting robots through a touch screen and a speech interface

Cited: 0
Authors
Oka, T. [1 ,2 ]
Matsumoto, H. [1 ,2 ]
Kibayashi, R. [1 ,2 ]
Affiliations
[1] Nihon Univ, Coll Ind Technol, 1-2-1 Izumicho, Narashino, Chiba 2758575, Japan
[2] Fukuoka Inst Technol, Fac Informat Engn, Fukuoka, Japan
Keywords
Life-supporting robot; Multimodal language; Speech; Touch screen; Human-robot interaction;
DOI
10.1007/s10015-011-0924-x
CLC classification
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
This article proposes a multimodal language for communicating with life-supporting robots through a touch screen and a speech interface. The language is designed for untrained users who need support in their daily lives from cost-effective robots. In this language, users combine spoken and pointing messages interactively to convey their intentions to the robots. Spoken messages include verb and noun phrases that describe intentions. Pointing messages are given when the user's finger touches a camera image, a picture of the robot's body, or a button on a touch screen at hand; they convey a location in the environment, a direction, a body part of the robot, a cue, a reply to a query, or other information that helps the robot. This work presents the philosophy and structure of the language.
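The abstract's pairing of a spoken phrase with an optional pointing message can be sketched as a simple data model. This is a minimal illustration under assumed names (`SpokenMessage`, `PointingMessage`, `MultimodalCommand` and their fields are hypothetical), not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class PointingMessage:
    # What the finger touched: "camera_image", "robot_body", or "button"
    target: str
    position: Optional[Tuple[int, int]] = None  # touch coordinates, if any
    label: Optional[str] = None                 # e.g. a reply button such as "yes"

@dataclass
class SpokenMessage:
    verb: str                      # action the user requests
    noun_phrases: Tuple[str, ...]  # objects or qualifiers of the action

@dataclass
class MultimodalCommand:
    # A command combines a spoken message with an optional pointing message.
    speech: SpokenMessage
    pointing: Optional[PointingMessage] = None

    def describe(self) -> str:
        text = f"{self.speech.verb} {' '.join(self.speech.noun_phrases)}"
        if self.pointing is not None:
            text += f" [pointing at {self.pointing.target}]"
        return text

# Example: the user says "bring the cup" and touches the camera image
cmd = MultimodalCommand(
    SpokenMessage("bring", ("the", "cup")),
    PointingMessage("camera_image", position=(120, 85)),
)
print(cmd.describe())  # → bring the cup [pointing at camera_image]
```

The key design point reflected here is that the pointing channel is optional and complementary: speech carries the intention, while a touch disambiguates a location, direction, body part, or reply.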
Pages: 292-296
Page count: 5