Infants understand the referential nature of human gaze but not robot gaze

Cited by: 30
Authors
Okumura, Yuko [1 ]
Kanakogi, Yasuhiro [1 ]
Kanda, Takayuki [2 ]
Ishiguro, Hiroshi [2 ,3 ]
Itakura, Shoji [1 ]
Affiliations
[1] Kyoto Univ, Grad Sch Letters, Dept Psychol, Kyoto 6068501, Japan
[2] ATR, Intelligent Robot & Commun Labs, Kyoto 6190288, Japan
[3] Osaka Univ, Grad Sch Engn Sci, Dept Syst Innovat, Osaka 5608531, Japan
Keywords
Infants; Referential gaze; Gaze following; Humanoid robot; Anticipatory eye movements; Eye tracking
DOI
10.1016/j.jecp.2013.02.007
CLC classification
B844 [Developmental Psychology (Human Psychology)]
Subject classification code
040202
Abstract
Infants can acquire much information by following the gaze direction of others. This type of social learning is underpinned by the ability to understand the relationship between gaze direction and a referent object (i.e., the referential nature of gaze). However, it is unknown whether human gaze is a privileged cue among the sources of information that infants use. Comparing human gaze with nonhuman (robot) gaze, we investigated whether infants' understanding of the referential nature of looking is restricted to human gaze. In the current study, we developed a novel task that used eye tracking to measure infants' anticipation of an object after observing an agent's gaze shift. Results revealed that although 10- and 12-month-olds followed the gaze direction of both a human and a robot, only 12-month-olds predicted the appearance of objects from referential gaze information, and only when the agent was the human. Such a prediction for objects reflects an understanding of referential gaze. Our study demonstrates that by 12 months of age, infants hold referential expectations specifically for the gaze shifts of humans. These specific expectations from human gaze may enable infants to acquire the varied information that others convey in social learning and social interaction. (C) 2013 Elsevier Inc. All rights reserved.
Pages: 86 - 95
Page count: 10