Estimation of reading subjective understanding based on eye gaze analysis

Cited by: 5
Authors
Sanches, Charles Lima [1 ]
Augereau, Olivier [1 ]
Kise, Koichi [1 ]
Affiliations
[1] Osaka Prefecture Univ, Sakai, Osaka, Japan
Source
PLOS ONE | 2018, Vol. 13, No. 10
Keywords
CONFIDENCE; MOVEMENTS;
DOI
10.1371/journal.pone.0206213
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject Classification Codes
07 ; 0710 ; 09 ;
Abstract
The integration of ubiquitous technologies into education has considerably enhanced the way we learn. Such technologies enable students to receive gradual feedback on their performance and to be provided with adapted learning materials. This is particularly important in foreign language learning, which requires intense daily practice. One of the main inputs to an adaptive learning system is the user's understanding of a reading material. The reader's understanding can be divided into two parts: objective understanding and subjective understanding. Objective understanding can be measured with comprehension questions about the content of the text. Subjective understanding is the reader's perception of their own understanding; it plays an important role in the reader's motivation, self-esteem, and confidence, but its automatic estimation remains a challenging task. This paper is one of the first to propose a method for estimating subjective understanding. We show that using eye gaze to predict subjective understanding improves the estimation by 13% compared to using comprehension questions.
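The abstract describes the approach only at a high level. As a minimal sketch of the general pipeline it implies (summarize an eye-gaze recording into features, then train a classifier on self-reported understanding labels), the Python fragment below may help. The feature set, the gaze_features function, the (x, y, duration) input format, and the random-forest model are illustrative assumptions, not the authors' implementation.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def gaze_features(fixations):
    # fixations: (n, 3) array of (x, y, duration_ms) rows -- a hypothetical
    # input format, not the paper's actual data schema.
    durations = fixations[:, 2]
    dx = np.diff(fixations[:, 0])                        # horizontal jumps between fixations
    regression_rate = np.sum(dx < 0) / max(len(dx), 1)   # fraction of backward jumps (re-reading)
    return np.array([
        len(durations),        # number of fixations
        durations.mean(),      # mean fixation duration
        durations.std(),       # variability of fixation duration
        regression_rate,       # re-reading tendency
    ])

# Toy data: one feature vector per reading session, with a binary
# self-reported "I understood this text" label (subjective understanding).
rng = np.random.default_rng(0)
sessions = [rng.random((int(rng.integers(50, 200)), 3)) for _ in range(40)]
X = np.vstack([gaze_features(s) for s in sessions])
y = rng.integers(0, 2, size=len(sessions))   # placeholder labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

With real data, the labels would come from readers' self-assessments rather than random placeholders, and the 13% improvement reported in the abstract refers to comparing such a gaze-based predictor against one built on comprehension-question scores alone.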
Pages: 16
Related Papers
50 records in total
  • [41] Automatic Eye Gaze Estimation using Geometric & Texture-based Networks
    Jyoti, Shreyank
    Dhall, Abhinav
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 2474 - 2479
  • [42] Estimation of Confidence Based on Eye Gaze: an Application to Multiple-choice Questions
    Yamada, Kento
    Augereau, Olivier
    Kise, Koichi
    PROCEEDINGS OF THE 2017 ACM INTERNATIONAL JOINT CONFERENCE ON PERVASIVE AND UBIQUITOUS COMPUTING AND PROCEEDINGS OF THE 2017 ACM INTERNATIONAL SYMPOSIUM ON WEARABLE COMPUTERS (UBICOMP/ISWC '17 ADJUNCT), 2017, : 217 - 220
  • [43] Evaluating fMRI-Based Estimation of Eye Gaze During Naturalistic Viewing
    Son, Jake
    Ai, Lei
    Lim, Ryan
    Xu, Ting
    Colcombe, Stanley
    Franco, Alexandre Rosa
    Cloud, Jessica
    LaConte, Stephen
    Lisinski, Jonathan
    Klein, Arno
    Craddock, R. Cameron
    Milham, Michael
    CEREBRAL CORTEX, 2020, 30 (03) : 1171 - 1184
  • [44] Estimation of eye-gaze direction and silent talk based on biological signal
    Hirose, Hideaki
    Koike, Yasuharu
    NEUROSCIENCE RESEARCH, 2010, 68 : E214 - E215
  • [45] The role of eye-gaze in understanding other minds
    Pellicano, E
    Rhodes, G
    BRITISH JOURNAL OF DEVELOPMENTAL PSYCHOLOGY, 2003, 21 : 33 - 43
  • [46] Eye Gaze Estimation Based on Ellipse Fitting and Three-Dimensional Model of Eye for "Intelligent Poster"
    Urano, Roma
    Suzuki, Ryuji
    Sasaki, Takeshi
    2014 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS (AIM), 2014, : 1157 - 1162
  • [47] Range image sensor based eye gaze estimation by using the relationship between the face and eye directions
    Tamaki H.
    Sakai T.
    Ota Y.
    Kusunoki F.
    Inagaki S.
    Egusa R.
    Sugimoto M.
    Mizoguchi H.
    Massey University (09) : 2297 - 2308
  • [48] A Hierarchical Generative Model for Eye Image Synthesis and Eye Gaze Estimation
    Wang, Kang
    Zhao, Rui
    Ji, Qiang
    2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018, : 440 - 448
  • [49] Understanding Mobile Reading via Camera Based Gaze Tracking and Kinematic Touch Modeling
    Guo, Wei
    Wang, Jingtao
    ICMI'18: PROCEEDINGS OF THE 20TH ACM INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2018, : 288 - 297
  • [50] EM-Gaze: eye context correlation and metric learning for gaze estimation
    Zhou, Jinchao
    Li, Guoan
    Shi, Feng
    Guo, Xiaoyan
    Wan, Pengfei
    Wang, Miao
    VISUAL COMPUTING FOR INDUSTRY BIOMEDICINE AND ART, 2023, 6 (01)