The amplitude of small eye movements can be accurately estimated with video-based eye trackers

Cited by: 4
Authors
Nystrom, Marcus [1]
Niehorster, Diederick C. [1,2]
Andersson, Richard [3]
Hessels, Roy S. [4]
Hooge, Ignace T. C. [4]
Affiliations
[1] Lund Univ, Humanities Lab, Box 201, SE-22100 Lund, Sweden
[2] Lund Univ, Dept Psychol, Box 201, SE-22100 Lund, Sweden
[3] Tobii Pro AB, Box 743, SE-18217 Danderyd, Sweden
[4] Univ Utrecht, Helmholtz Inst, Expt Psychol, Heidelberglaan 1, NL-3584 CS Utrecht, Netherlands
Keywords
Microsaccades; EyeLink 1000 Plus; Image resolution; Pupil
DOI
10.3758/s13428-021-01780-6
Chinese Library Classification
B841 [Psychological research methods]
Discipline code
040201
Abstract
Estimating the gaze direction with a digital video-based pupil and corneal reflection (P-CR) eye tracker is challenging, partly because a video camera is limited in terms of spatial and temporal resolution, and because the captured eye images contain noise. Through computer simulation, we evaluated the localization accuracy of pupil and CR centers in the eye image for small eye rotations (≪ 1 deg). Results highlight how inaccuracies in center localization are related to (1) how many pixels the pupil and CR span in the eye camera image, (2) the method used to compute the center of the pupil and CRs, and (3) the level of image noise. Our results provide a possible explanation for why the amplitude of small saccades may not be accurately estimated by many currently used video-based eye trackers. We conclude that eye movements with arbitrarily small amplitudes can be accurately estimated using the P-CR eye-tracking principle, given that the level of image noise is low and the pupil and CR span enough pixels in the eye camera, or if localization of the CR is based on the intensity values in the eye image instead of a binary representation.
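The abstract's contrast between binary and intensity-based center localization can be illustrated with a minimal sketch. This is not the authors' simulation code: the Gaussian blob model, its size, and the threshold value are assumptions chosen purely for illustration. The point it demonstrates is that a centroid computed over thresholded (binary) pixels moves in discrete steps as the true blob center shifts sub-pixel distances, while an intensity-weighted centroid tracks the true center smoothly.

```python
import math

def make_blob(cx, cy, size=9, sigma=1.2):
    """Render a small Gaussian spot (a stand-in for the corneal
    reflection) with a sub-pixel center (cx, cy) on a pixel grid."""
    return [[math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
             for x in range(size)]
            for y in range(size)]

def binary_centroid(img, threshold=0.5):
    """Centroid of thresholded pixels: each pixel counts 0 or 1,
    so the estimate can only change when a pixel crosses the threshold."""
    xs = ys = n = 0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            if v > threshold:
                xs += x; ys += y; n += 1
    return xs / n, ys / n

def intensity_centroid(img):
    """Centroid weighted by grey-level intensity: a sub-pixel estimate."""
    xs = ys = w = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            xs += x * v; ys += y * v; w += v
    return xs / w, ys / w

# Sweep the true horizontal center in 0.1-pixel steps: the binary
# estimate is piecewise constant, the intensity estimate is smooth.
for cx in (4.0, 4.1, 4.2, 4.3, 4.4, 4.5):
    img = make_blob(cx, 4.0)
    bx, _ = binary_centroid(img)
    ix, _ = intensity_centroid(img)
    print(f"true x = {cx:.1f}   binary x = {bx:.3f}   intensity x = {ix:.3f}")
```

Under these assumed parameters, the binary centroid stays stuck at the same value for several consecutive sub-pixel shifts of the true center, mirroring the paper's finding that small-amplitude movements can be misestimated when center localization relies on a binary representation.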
Pages: 657 - 669
Page count: 13
Related papers
50 records in total
  • [21] A review of eye tracking research on video-based learning
    Deng, Ruiqi
    Gao, Yifan
    EDUCATION AND INFORMATION TECHNOLOGIES, 2023, 28 (06) : 7671 - 7702
  • [22] A model-based approach to video-based eye tracking
    Li, Feng
    Munn, Susan
    Pelz, Jeff
    JOURNAL OF MODERN OPTICS, 2008, 55 (4-5) : 503 - 531
  • [23] Dummy eye measurements of microsaccades: Testing the influence of system noise and head movements on microsaccade detection in a popular video-based eye tracker
    Hermens, Frouke
    JOURNAL OF EYE MOVEMENT RESEARCH, 2015, 8 (01):
  • [24] Potential of a laser pointer contact lens to improve the reliability of video-based eye-trackers in indoor and outdoor conditions
    Robert, Francois-Mael
    Nourrit, Vincent
    Otheguy, Marion
    de la Tocnaye, Jean-Louis de Bougrenet
    JOURNAL OF EYE MOVEMENT RESEARCH, 2024, 17 (01) : 1 - 16
  • [25] The pupil is faster than the corneal reflection (CR): Are video based pupil-CR eye trackers suitable for studying detailed dynamics of eye movements?
    Hooge, Ignace
    Holmqvist, Kenneth
    Nystrom, Marcus
    VISION RESEARCH, 2016, 128 : 6 - 18
  • [26] SacCalib: Reducing Calibration Distortion for Stationary Eye Trackers Using Saccadic Eye Movements
    Huang, Michael Xuelin
    Bulling, Andreas
    ETRA 2019: 2019 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, 2019,
  • [27] Video-based eye tracking - Our experience with advanced stimuli design for eye tracking software
    Rufa, A
    Mariottini, GL
    Prattichizzo, D
    Alessandrini, D
    Vicino, A
    Federico, A
    CLINICAL AND BASIC OCULOMOTOR RESEARCH: IN HONOR OF DAVID S. ZEE, 2005, 1039 : 575 - 579
  • [28] Simultaneous eye tracking with a Tracking Scanning Laser Ophthalmoscope (TSLO) and a video-based eye tracker
    Prahalad, Krishnamachari S.
    Coates, Daniel R.
    INVESTIGATIVE OPHTHALMOLOGY & VISUAL SCIENCE, 2020, 61 (07)
  • [29] A nonvisual eye tracker calibration method for video-based tracking
    Harrar, Vanessa
    Le Trung, William
    Malienko, Anton
    Khan, Aarlenne Zein
    JOURNAL OF VISION, 2018, 18 (09) : 1 - 11
  • [30] Using structured illumination to enhance video-based eye tracking
    Li, Feng
    Kolakowski, Susan
    Pelz, Jeff
    2007 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOLS 1-7, 2007 : 373 - 376