Real-Time Conversational Gaze Synthesis for Avatars

Cited by: 3
Authors
Canales, Ryan [1 ]
Jain, Eakta [2 ]
Joerg, Sophie [1 ,3 ]
Affiliations
[1] Clemson Univ, Clemson, SC 29634 USA
[2] Univ Florida, Gainesville, FL USA
[3] Univ Bamberg, Bamberg, Germany
Source
15TH ANNUAL ACM SIGGRAPH CONFERENCE ON MOTION, INTERACTION AND GAMES, MIG 2023, 2023
Funding
US National Science Foundation
Keywords
gaze animation; avatars; motion perception; virtual reality; EYE GAZE; MODEL;
DOI
10.1145/3623264.3624446
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Eye movement plays an important role in face-to-face communication. In this work, we present a deep learning approach for synthesizing the eye movements of avatars in two-party conversations and evaluate viewer perception of different types of eye motions. We aim to synthesize believable gaze behavior from head motions and audio features, as these are typically available in virtual reality applications. To this end, we captured the head motion, eye motion, and audio of several two-party conversations and trained an RNN-based model to predict where an avatar looks in a two-person conversational scenario. We evaluated our approach with a user study on the perceived quality of the eye animation and compared our method with other eye animation methods. While our model was not rated highest, it and our user study yield a series of insights on model features, viewer perception, and study design, which we present.
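The abstract describes a recurrent model that maps per-frame head-motion and audio features to a gaze target. As a rough illustration of that idea, the sketch below runs a single hand-rolled GRU over a feature sequence and emits a (yaw, pitch) gaze direction per frame. All dimensions, the single-GRU layout, and the weight initialization are illustrative assumptions, not the authors' actual architecture or trained parameters.

```python
import numpy as np

# Assumed feature sizes: 6-D head motion (e.g. rotation + velocity),
# 13-D audio features (e.g. MFCCs), 2-D output (gaze yaw, pitch).
HEAD_DIM, AUDIO_DIM, HIDDEN, OUT = 6, 13, 32, 2
IN_DIM = HEAD_DIM + AUDIO_DIM

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialized GRU weights; a trained model would learn these
# from captured two-party conversation data.
Wz = rng.standard_normal((HIDDEN, IN_DIM + HIDDEN)) * 0.1  # update gate
Wr = rng.standard_normal((HIDDEN, IN_DIM + HIDDEN)) * 0.1  # reset gate
Wh = rng.standard_normal((HIDDEN, IN_DIM + HIDDEN)) * 0.1  # candidate state
Wo = rng.standard_normal((OUT, HIDDEN)) * 0.1              # output projection

def gru_step(h, x):
    """One GRU update: gates decide how much hidden state to keep."""
    hx = np.concatenate([h, x])
    z = sigmoid(Wz @ hx)
    r = sigmoid(Wr @ hx)
    h_cand = np.tanh(Wh @ np.concatenate([r * h, x]))
    return (1 - z) * h + z * h_cand

def predict_gaze(head_seq, audio_seq):
    """Run the recurrence over a sequence; emit (yaw, pitch) per frame."""
    h = np.zeros(HIDDEN)
    out = []
    for head, audio in zip(head_seq, audio_seq):
        h = gru_step(h, np.concatenate([head, audio]))
        out.append(Wo @ h)
    return np.array(out)

T = 90  # e.g. 3 s of input at 30 fps
gaze = predict_gaze(rng.standard_normal((T, HEAD_DIM)),
                    rng.standard_normal((T, AUDIO_DIM)))
print(gaze.shape)  # (90, 2): one (yaw, pitch) pair per frame
```

In a real-time avatar system, the same recurrence would be stepped once per incoming frame, carrying the hidden state across calls rather than reprocessing the whole sequence.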
Pages: 7