A multimodal language to communicate with life-supporting robots through a touch screen and a speech interface

Cited: 0
Authors
Oka, T. [1 ]
Matsumoto, H. [1 ]
Kibayashi, R. [1 ]
Affiliations
[1] Nihon Univ, Coll Ind Technol, 1-2-1 Izumicho, Narashino, Chiba 2758575, Japan
Keywords
Life-supporting robot; multi-modal language; speech; touch screen; human-robot interaction
DOI
None
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper proposes a multimodal language for communicating with life-supporting robots through a touch screen and a speech interface. The language is designed for untrained users who need support in daily life from cost-effective robots. In this language, users combine spoken and pointing messages interactively in order to convey their intentions to robots. Spoken messages include verb and noun phrases that describe intentions. Pointing messages are given when users finger-touch a camera image, a picture of the robot's body, or a button on a handheld touch screen; they convey a location in the environment, a direction, a body part of the robot, a cue, a reply to a query, or other information that helps the robot. This work presents the philosophy and structure of the language.
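The message model described in the abstract — a spoken verb/noun-phrase command completed by pointing messages on the touch screen — can be illustrated with a small sketch. The class and field names below are assumptions for illustration only, not the authors' actual specification.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the paper's message model: an utterance such as
# "put this there" is completed by touch-screen pointing messages that
# supply the missing referents. All names here are illustrative.

@dataclass
class SpokenMessage:
    verb: str            # e.g. "put"
    noun_phrases: list   # e.g. ["this", "there"]

@dataclass
class PointingMessage:
    kind: str            # "location", "direction", "body_part", "cue", or "reply"
    payload: tuple       # e.g. touch coordinates on the camera image

@dataclass
class RobotCommand:
    spoken: SpokenMessage
    pointings: list = field(default_factory=list)

    def is_grounded(self):
        # A command is executable once each deictic noun phrase
        # ("this", "there", ...) has been resolved by a pointing message.
        deictics = [np for np in self.spoken.noun_phrases
                    if np in ("this", "that", "here", "there")]
        return len(self.pointings) >= len(deictics)

cmd = RobotCommand(SpokenMessage("put", ["this", "there"]))
assert not cmd.is_grounded()                                   # waiting for two touches
cmd.pointings.append(PointingMessage("location", (120, 84)))   # object on camera image
cmd.pointings.append(PointingMessage("location", (300, 210)))  # target location
assert cmd.is_grounded()
```

The interactive combination of modalities is the key design choice: neither the utterance nor the touches alone identify the task, but together they do.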
Pages: 326-329
Page count: 4
Related Papers
13 items in total
  • [1] A multimodal language to communicate with life-supporting robots through a touch screen and a speech interface
    Oka, T.
    Matsumoto, H.
    Kibayashi, R.
    ARTIFICIAL LIFE AND ROBOTICS, 2011, 16 (03): 292-296
  • [2] User study of a life-supporting humanoid directed in a multimodal language
    Oka, T.
    Abe, T.
    Sugita, K.
    Yokota, M.
    ARTIFICIAL LIFE AND ROBOTICS, 2011, 16 (02): 224-228
  • [3] User study of a life-supporting humanoid directed in a multimodal language
    Oka, T.
    Abe, T.
    Sugita, K.
    Yokota, M.
    PROCEEDINGS OF THE SIXTEENTH INTERNATIONAL SYMPOSIUM ON ARTIFICIAL LIFE AND ROBOTICS (AROB 16TH '11), 2011: 322-325
  • [4] Multimodal user interaction with in-car equipment in real conditions based on touch and speech modes in the Persian language
    Nazari, Fateme
    Tabibian, Shima
    Homayounvala, Elaheh
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (09): 12995-13023
  • [5] Facilitating Client Ability to Communicate in Palliative End-of-Life Care: Impact of Speech-Language Pathologists
    Pollens, Robin
    TOPICS IN LANGUAGE DISORDERS, 2020, 40 (03): 264-277
  • [6] Supporting Crucial Conversations: Speech-Language Pathology Intervention in Palliative End-of-Life Care
    Pollens, Robin
    Chahda, Laura
    Freeman-Sanderson, Amy
    Lalonde Myers, Emilie
    Mathison, Bernice
    JOURNAL OF PALLIATIVE MEDICINE, 2021, 24 (07): 969-970
  • [7] Studying Multimodal Interactions: Understanding Dialogue Mechanisms through Combined Observation of Speech, Language, and Body Movement
    Ishii, Ryo
    NTT TECHNICAL REVIEW, 2021, 19 (11): 13-17
  • [8] The role of speech-language pathologists in supporting theory of mind through literacy-based activities
    Secora, Kristen
    JOURNAL OF COMMUNICATION DISORDERS, 2024, 111
  • [9] Participatory Design: Addressing Speech-Language Pathologists' Challenges through Collaborative Interface Design
    Avant, James
    Zhang, Ting
    Seward, Renee
    Dugan, Sarah
    Schwab-Farrell, Sarah
    Li, Sarah R.
    Biehl, Sarah
    Boyce, Suzanne
    Riley, Michael A.
    Mast, T. Douglas
    DESIGN ISSUES, 2025, 41 (02): 44-59