Eye-tracking of visual attention in web-based assessment using the Force Concept Inventory

Citations: 22
Authors
Han, Jing [1 ]
Chen, Li [1 ,2 ]
Fu, Zhao [1 ]
Fritchman, Joseph [1 ]
Bao, Lei [1 ]
Affiliations
[1] Ohio State Univ, Dept Phys, 174 W 18th Ave, Columbus, OH 43210 USA
[2] Univ Sci & Technol China, Nano Sci & Technol Inst, Suzhou, Peoples R China
Keywords
eye tracking; FCI; mechanics concepts; conceptual learning; physics education;
DOI
10.1088/1361-6404/aa6c49
Chinese Library Classification
G40 [Education]
Discipline Classification Codes
040101; 120403
Abstract
This study used eye-tracking technology to investigate students' visual attention while taking the Force Concept Inventory (FCI) in a web-based interface. Eighty-nine university students were randomly assigned to a pretest group and a post-test group. Students took the 30-question FCI on a computer equipped with an eye-tracker, with seven weeks of instruction between the pretest and post-test data collections. Students' performance on the FCI improved significantly from pretest to post-test. Meanwhile, the eye-tracking results reveal that the time students spent taking the FCI was not affected by student performance and did not change from pretest to post-test. Analysis of students' attention to answer choices shows that on the pretest students primarily focused on the naive choices and ignored the expert choices. On the post-test, although students had shifted their primary attention to the expert choices, they still maintained a high level of attention to the naive choices, indicating significant conceptual mixing and competition during problem solving. Outcomes of this study provide new insights into students' conceptual development in learning physics.
Pages: 15
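
The attention analysis described in the abstract amounts to comparing how students' dwell time on the answer choices is split between "naive" (misconception-aligned) and "expert" (scientifically correct) areas of interest on the pretest versus the post-test. A minimal illustrative sketch of that kind of computation follows; it is not the authors' analysis code, and the fixation table, AOI labels, and durations are hypothetical.

import pandas as pd

# Hypothetical fixation records: one row per fixation on an answer choice,
# labelled by test phase and by the AOI category of the choice fixated.
fixations = pd.DataFrame({
    "phase":    ["pre", "pre", "pre", "post", "post", "post"],
    "aoi":      ["naive", "expert", "naive", "expert", "naive", "expert"],
    "duration": [420, 180, 350, 510, 300, 640],  # dwell time in milliseconds
})

# Share of answer-choice dwell time spent on each AOI category within each phase.
totals = fixations.groupby("phase")["duration"].transform("sum")
fixations["share"] = fixations["duration"] / totals
attention_share = fixations.groupby(["phase", "aoi"])["share"].sum().unstack()
print(attention_share)

With this toy data, the pretest row is dominated by the naive share while the post-test row is dominated by the expert share, mirroring the qualitative pattern reported in the abstract.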