An interactive eye-tracking system for measuring radiologists' visual fixations in volumetric CT images: Implementation and initial eye-tracking accuracy validation

Cited by: 8
Authors
Gong, Hao [1 ]
Hsieh, Scott S. [1 ]
Holmes, David R., III
Cook, David A.
Inoue, Akitoshi [1 ]
Bartlett, David J. [1 ,2 ,3 ]
Baffour, Francis [1 ]
Takahashi, Hiroaki [1 ]
Leng, Shuai [1 ]
Yu, Lifeng [1 ]
McCollough, Cynthia H. [1 ]
Fletcher, Joel G. [1 ]
Affiliations
[1] Mayo Clin, Dept Radiol, Rochester, MN 55901 USA
[2] Mayo Clin, Dept Physiol & Biomed Engn, Rochester, MN USA
[3] Mayo Clin, Dept Internal Med, Rochester, MN USA
Funding
US National Institutes of Health (NIH)
Keywords
biofeedback; computed tomography; eye tracking; observer performance; expertise; gaze
DOI
10.1002/mp.15219
Chinese Library Classification
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Discipline Codes
1002; 100207; 1009
Abstract
Purpose: Eye-tracking approaches have been used to understand the visual search process in radiology. However, previous eye-tracking work in computed tomography (CT) has been limited largely to single cross-sectional images or video playback of the reconstructed volume, which do not accurately reflect radiologists' visual search activities or their interactivity with three-dimensional image data at a computer workstation (e.g., scroll, pan, and zoom) during visual evaluation of diagnostic imaging targets. We have developed a platform that integrates eye-tracking hardware with in-house-developed reader workstation software to allow monitoring of the visual search process and reader-image interactions in clinically relevant reader tasks. The purpose of this work is to validate the spatial accuracy of eye-tracking data using this platform for different eye-tracking data acquisition modes.
Methods: An eye-tracker was integrated with a previously developed workstation designed for reader performance studies. The integrated system captured real-time eye movement and workstation events at a 1000 Hz sampling frequency. The eye-tracker was operated either in head-stabilized mode or in free-movement mode. In head-stabilized mode, the reader positioned their head on a manufacturer-provided chinrest. In free-movement mode, a biofeedback tool emitted an audio cue when the head position was outside the data collection range (general biofeedback) or outside a narrower range of positions near the calibration position (strict biofeedback). Four radiologists and one resident participated in three studies to determine eye-tracking spatial accuracy under three constraint conditions: head-stabilized mode (i.e., with use of a chinrest), free movement with general biofeedback, and free movement with strict biofeedback.
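The free-movement biofeedback described above can be sketched as a simple head-position check. This is a minimal illustration only: the Euclidean-distance criterion, the threshold values, and the function names are assumptions, since the abstract does not specify the actual data collection range or calibration tolerance.

```python
import math

# Illustrative deviation limits (mm) for the two biofeedback modes;
# these numbers are hypothetical, not the study's actual limits.
ALLOWED_DEVIATION_MM = {"general": 30.0, "strict": 10.0}

def should_emit_audio_cue(head_pos_mm, calib_pos_mm, mode="strict"):
    """Return True when the head has drifted outside the allowed range
    around the calibration position, i.e., when the audio cue should fire."""
    deviation = math.dist(head_pos_mm, calib_pos_mm)
    return deviation > ALLOWED_DEVIATION_MM[mode]
```

For example, a head 20 mm from the calibration position would trigger the cue under strict biofeedback but not under the looser general limit.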
Study 1 evaluated the impact of head stabilization versus general or strict biofeedback using a crosshair target prior to the integration of the eye-tracker with the image-viewing workstation. In Study 2, after integration of the eye-tracker and reader workstation, readers were asked to fixate on targets that were randomly distributed within a volumetric digital phantom. In Study 3, readers used the integrated system to scroll through volumetric patient CT angiographic images while fixating on the centerline of designated blood vessels (from the left coronary artery to the dorsalis pedis artery). Spatial accuracy was quantified as the offset between the center of the intended target and the detected fixation, in units of image pixels and degrees of visual angle.
Results: The three head-position constraint conditions yielded comparable accuracy in the studies using digital phantoms. For Study 1, involving the digital crosshair, the median ± standard deviation of offset values among readers was 15.2 ± 7.0 image pixels with the chinrest, 14.2 ± 3.6 image pixels with strict biofeedback, and 19.1 ± 6.5 image pixels with general biofeedback. For Study 2, using the random-dot phantom, the median ± standard deviation of offset values was 16.7 ± 28.8 pixels with use of a chinrest, 16.5 ± 24.6 pixels using strict biofeedback, and 18.0 ± 22.4 pixels using general biofeedback, which translated to a visual angle of about 0.8 degrees for all three conditions. We found no obvious association between eye-tracking accuracy and target size or view time. In Study 3, viewing patient images, use of the chinrest and strict biofeedback demonstrated comparable accuracy, while use of general biofeedback demonstrated slightly worse accuracy. The median ± standard deviation of offset values was 14.8 ± 11.4 pixels with use of a chinrest, 21.0 ± 16.2 pixels using strict biofeedback, and 29.7 ± 20.9 image pixels using general biofeedback.
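The reported conversion between pixel offsets and degrees of visual angle follows the standard geometry theta = 2·atan(s / 2D), where s is the physical size of the offset on screen and D is the viewing distance. A sketch of that conversion, with a hypothetical pixel pitch and viewing distance (the study's display geometry is not given in the abstract, and image pixels may differ from display pixels under zoom):

```python
import math

def offset_to_visual_angle_deg(offset_px, pixel_pitch_mm, viewing_distance_mm):
    """Convert an on-screen fixation offset in display pixels to degrees of
    visual angle, via theta = 2 * atan(s / (2 * D)) for an offset of
    physical size s viewed from distance D."""
    offset_mm = offset_px * pixel_pitch_mm
    return math.degrees(2.0 * math.atan(offset_mm / (2.0 * viewing_distance_mm)))
```

For instance, a 17-pixel offset on a display with 0.28 mm pixel pitch viewed from 650 mm subtends roughly 0.42 degrees; the exact angle for any given study depends on the actual display and viewing geometry.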
These offsets corresponded to visual angles ranging from 0.7 degrees to 1.3 degrees.
Conclusions: An integrated eye-tracker system to assess reader eye movement and interactive viewing in relation to imaging targets demonstrated reasonable spatial accuracy for assessment of visual fixation. The head-free movement condition with audio biofeedback performed similarly to the head-stabilized mode.
Pages: 6710-6723
Page count: 14