Evaluating the Robustness of an Appearance-based Gaze Estimation Method for Multimodal Interfaces

Cited by: 5
Authors: Li, Nanxiang [1]; Busso, Carlos [1]
Affiliations: [1] Univ Texas Dallas, MSP Lab, 800 W Campbell Rd, Richardson, TX 75080 USA
Keywords: Gaze estimation; eigenspace analysis; computer user interface; multimodal interfaces
DOI: 10.1145/2522848.2522876
CLC number: TP301 [Theory, Methods]
Discipline code: 081202
Abstract
Given the crucial role of eye movements in visual attention, tracking gaze behavior is an important research problem in applications including biometric identification, attention modeling, and human-computer interaction. Most existing gaze tracking methods require a repetitive system calibration process and are sensitive to the user's head movements; therefore, they cannot be easily integrated into current multimodal interfaces. This paper investigates an appearance-based approach for gaze estimation that requires minimal calibration and is robust to head motion. The approach consists of building an orthonormal basis, or eigenspace, of the eye appearance with principal component analysis (PCA). Unlike previous studies, we build the eigenspace using image patches displaying both eyes. The projections onto the basis are used to train regression models that predict the gaze location. The approach is trained and tested with a new multimodal corpus introduced in this paper. We consider several variables, such as the distance between the user and the computer monitor, and head movement. The evaluation includes the performance of the proposed gaze estimation system with and without head movement. It also compares subject-dependent and subject-independent conditions at different distances. We report promising results which suggest that the proposed gaze estimation approach is a feasible and flexible scheme to facilitate gaze-based multimodal interfaces.
Pages: 91–98 (8 pages)
Related Papers (50 records)
  • [1] Tan, K. H.; Kriegman, D. J.; Ahuja, N. Appearance-based eye gaze estimation. Sixth IEEE Workshop on Applications of Computer Vision, Proceedings, 2002: 191–195.
  • [2] Zhang, Xucong; Sugano, Yusuke; Fritz, Mario; Bulling, Andreas. Appearance-Based Gaze Estimation in the Wild. 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015: 4511–4520.
  • [3] Li, Jing; Chen, Zejin; Zhong, Yihao; Lam, Hak-Keung; Han, Junxia; Ouyang, Gaoxiang; Li, Xiaoli; Liu, Honghai. Appearance-Based Gaze Estimation for ASD Diagnosis. IEEE Transactions on Cybernetics, 2022, 52(07): 6504–6517.
  • [4] Choi, Jinsoo; Ahn, Byungtae; Park, Jaesik; Kweon, In So. Appearance-based Gaze Estimation using Kinect. 2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), 2013: 260–261.
  • [5] Nikan, Soodeh; Upadhyay, Devesh. Appearance-Based Gaze Estimation for Driver Monitoring. Gaze Meets Machine Learning Workshop, Vol 210, 2022: 127–139.
  • [6] Zhang, Xucong; Sugano, Yusuke; Bulling, Andreas. Revisiting Data Normalization for Appearance-Based Gaze Estimation. 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA 2018), 2018.
  • [7] Chen, Zhaokang; Shi, Bertram E. Offset Calibration for Appearance-Based Gaze Estimation via Gaze Decomposition. 2020 IEEE Winter Conference on Applications of Computer Vision (WACV), 2020: 259–268.
  • [8] Elfares, Mayar; Hu, Zhiming; Reisert, Pascal; Bulling, Andreas; Kuesters, Ralf. Federated Learning for Appearance-based Gaze Estimation in the Wild. Gaze Meets Machine Learning Workshop, Vol 210, 2022: 20–36.
  • [9] Lyu, Junfeng; Xu, Feng. Towards Eyeglasses Refraction in Appearance-based Gaze Estimation. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2023: 693–702.
  • [10] Lu, Xiaofeng; Zhao, Zichen; Ke, Weitao; Yan, Qingsong; Liu, Zhi. Young-gaze: an appearance-based gaze estimation solution for adolescents. Signal, Image and Video Processing, 2024, 18(10): 7145–7155.