Gaze in a real-world social interaction: A dual eye-tracking study

Times cited: 43
Authors
Macdonald, Ross G. [1 ]
Tatler, Benjamin W. [2 ]
Affiliations
[1] Univ Manchester, Sch Hlth Sci, Manchester M13 9PL, Lancs, England
[2] Univ Aberdeen, Sch Psychol, Aberdeen, Scotland
Source
Quarterly Journal of Experimental Psychology
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Social attention; eye movements; joint attention; social interaction; gaze cues; ATTENTION; BEHAVIOR; HYPOTHESIS; MOVEMENTS; CONTACT; LOOKING;
DOI
10.1177/1747021817739221
Chinese Library Classification (CLC) code
B84 [Psychology];
Discipline classification code
04; 0402;
Abstract
People communicate using verbal and non-verbal cues, including gaze cues. Gaze allocation can be influenced by social factors; however, most research on gaze cueing has not considered these factors. The presence of social roles was manipulated in a natural, everyday collaborative task while eye movements were measured. In pairs, participants worked together to make a cake. Half of the pairs were given roles (Chef or Gatherer) and the other half were not. Across all participants we found, contrary to the results of static-image experiments, that participants spent very little time looking at each other, challenging the generalisability of the conclusions from lab-based paradigms. However, participants were more likely than not to look at their partner when receiving an instruction, highlighting the typical coordination of gaze cues and verbal communication in natural interactions. The mean duration of instances in which the partners looked at each other (partner gaze) was longer in the roles condition, and these participants were quicker to align their gaze with their partners (shared gaze). In addition, we found some indication that when hearing spoken instructions, listeners in the roles condition looked at the speaker more than listeners in the no roles condition. We conclude that social context can affect our gaze behaviour during a social interaction.
Pages: 2162-2173
Page count: 12
Related papers
50 records in total (items [31]-[40] listed below)
  • [31] The Relationship between Eye Gaze and Verb Agreement in American Sign Language: An Eye-tracking Study
    Robin Thompson
    Karen Emmorey
    Robert Kluender
    Natural Language & Linguistic Theory, 2006, 24 : 571 - 604
  • [32] How charisma shapes a leader's gaze behavior: An eye-tracking study
    Maran, Thomas
    Liegl, Simon
    Furtner, Marco
    Ravet-Brown, Theo
    INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2023, 58 : 683 - 683
  • [33] Effect of Subtitles on Gaze Behavior during Shot Changes: An Eye-tracking Study
    Joy, Jeril
    Padakannaya, Prakash
    INTERNATIONAL JOURNAL OF PSYCHOLOGICAL RESEARCH, 2023, 16 (02) : 4 - 13
  • [34] The relationship between eye gaze and verb agreement in American sign language: An eye-tracking study
    Thompson, R
    Emmorey, K
    Kluender, R
    NATURAL LANGUAGE & LINGUISTIC THEORY, 2006, 24 (02) : 571 - 604
  • [35] Reward and Cue Effects on Orientation Judgements: A Gaze Contingent Eye-Tracking Study
    Anderson, Britt
    Marsh, Christie Haskell
    PERCEPTION, 2019, 48 : 216 - 216
  • [36] Infant responses to direct gaze and associations to autism: A live eye-tracking study
    Rudling, Maja
    Nyström, Pär
    Bussu, Giorgia
    Bölte, Sven
    Falck-Ytter, Terje
    AUTISM, 2024, 28 (07) : 1677 - 1689
  • [37] Eye to eye contact in social anxiety: New insights from eye-tracking and psychophysiological data in a mutual gaze design
    Wieser, Matthias J.
    Muehlberger, Andreas
    Pauli, Paul
    PSYCHOPHYSIOLOGY, 2007, 44 : S35 - S35
  • [38] Simulating interaction: Using gaze-contingent eye-tracking to measure the reward value of social signals in toddlers with and without autism
    Vernetti, Angelina
    Senju, Atsushi
    Charman, Tony
    Johnson, Mark H.
    Gliga, Teodora
    DEVELOPMENTAL COGNITIVE NEUROSCIENCE, 2018, 29 : 21 - 29
  • [39] Tracking the reading eye: towards a model of real-world reading
    Jarodzka, H.
    Brand-Gruwel, S.
    JOURNAL OF COMPUTER ASSISTED LEARNING, 2017, 33 (03) : 193 - 201
  • [40] CrowdEyes: Crowdsourcing for Robust Real-World Mobile Eye Tracking
    Othman, Mohammad
    Amaral, Telmo
    McNaney, Roisin
    Smeddinck, Jan D.
    Vines, John
    Olivier, Patrick
    PROCEEDINGS OF THE 19TH INTERNATIONAL CONFERENCE ON HUMAN-COMPUTER INTERACTION WITH MOBILE DEVICES AND SERVICES (MOBILEHCI '17), 2017