Cognitive Context Detection in UAS Operators Using Eye-Gaze Patterns on Computer Screens

Cited: 4
Authors
Mannaru, Pujitha [1 ]
Balasingam, Balakumar [1 ]
Pattipati, Krishna [1 ]
Sibley, Ciara [2 ]
Coyne, Joseph [2 ]
Affiliations
[1] Univ Connecticut, Dept Elect & Comp Engn, 371 Fairfield Way,U-4157, Storrs, CT 06269 USA
[2] Naval Res Lab, Warfighter Human Syst Integrat Lab, 4555 Overlook Ave SW, Washington, DC 20375 USA
Source
NEXT-GENERATION ANALYST IV, 2016, Vol. 9851
Keywords
unmanned aerial systems; unmanned aerial vehicles; human computer interaction; operator fatigue detection; cognitive workload; eye-gaze metrics; eye movement metrics
DOI
10.1117/12.2224184
Chinese Library Classification (CLC)
O43 [Optics]
Subject Classification Codes
070207; 0803
Abstract
In this paper, we demonstrate the use of eye-gaze metrics of unmanned aerial systems (UAS) operators as effective indices of their cognitive workload. Our analyses are based on an experiment in which twenty participants performed pre-scripted UAS missions of three difficulty levels by interacting with two custom-designed graphical user interfaces (GUIs) displayed side by side. First, we compute several eye-gaze metrics, both traditional eye movement metrics and newly proposed ones, and analyze their effectiveness as cognitive classifiers. Most of the eye-gaze metrics are computed by dividing the computer screen into "cells". Then, we perform several analyses in order to select metrics for effective cognitive context classification in our specific application; the objectives of these analyses are to (i) identify appropriate ways to divide the screen into cells; (ii) select appropriate metrics for training and classification of cognitive features; and (iii) identify a suitable classification method.
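The cell-based featurization and classifier-selection steps described above lend themselves to a short illustration. The sketch below is a minimal, hypothetical reading of the approach, not the paper's actual pipeline: the 4x4 grid, 1920x1080 screen, dwell-proportion features, synthetic gaze data, and the two candidate classifiers are all assumptions made for illustration. It bins raw gaze samples into screen cells, converts per-cell dwell proportions into a feature vector, and compares two off-the-shelf classifiers by cross-validation, loosely mirroring objectives (i)-(iii).

# Minimal sketch (assumed: 4x4 grid, 1920x1080 screen, dwell-proportion
# features, logistic regression vs. SVM as candidate classifiers; the
# paper's actual cell divisions, metrics, and classifiers may differ).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def cell_features(gaze_xy, screen=(1920, 1080), grid=(4, 4)):
    """Per-cell dwell proportions for one trial's gaze samples (N x 2, pixels)."""
    xy = np.asarray(gaze_xy, float)
    cols, rows = grid
    # Bin each sample into a cell index; clip so edge samples stay on-screen.
    cx = np.clip((xy[:, 0] / screen[0] * cols).astype(int), 0, cols - 1)
    cy = np.clip((xy[:, 1] / screen[1] * rows).astype(int), 0, rows - 1)
    counts = np.zeros(rows * cols)
    np.add.at(counts, cy * cols + cx, 1)
    return counts / len(xy)

# Synthetic stand-in data: "low workload" trials concentrate gaze on one
# GUI; "high workload" trials spread gaze across the whole display.
rng = np.random.default_rng(0)
low = [cell_features(rng.normal([480, 540], [150, 200], (300, 2))) for _ in range(40)]
high = [cell_features(rng.uniform([0, 0], [1920, 1080], (300, 2))) for _ in range(40)]
X = np.vstack(low + high)
y = np.array([0] * 40 + [1] * 40)

# Objective (iii): compare candidate classifiers by cross-validation.
for name, clf in [("logreg", LogisticRegression(max_iter=1000)), ("svm", SVC())]:
    print(name, cross_val_score(clf, X, y, cv=5).mean().round(3))

Coarser or finer grids trade spatial resolution against feature dimensionality, which is presumably why the paper treats the choice of cell division as its own analysis step.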
Pages: 11
Related Papers
50 records in total; [41]-[50] shown below
  • [41] Eye-Gaze Detection with a Single WebCAM Based on Geometry Features Extraction
    Nguyen Huu Cuong
    Huynh Thai Hoang
11TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV 2010), 2010 : 2507 - 2512
  • [42] Classification approach for understanding implications of emotions using eye-gaze
    Pradeep Raj Krishnappa Babu
    Uttama Lahiri
    Journal of Ambient Intelligence and Humanized Computing, 2020, 11 : 2701 - 2713
  • [43] Activity recognition using eye-gaze movements and traditional interactions
    Courtemanche, Francois
    Aimeur, Esma
    Dufresne, Aude
    Najjar, Mehdi
    Mpondo, Franck
    INTERACTING WITH COMPUTERS, 2011, 23 (03) : 202 - 213
  • [44] Eye-Gaze and Mouse-Movements on Web Search as Indicators of Cognitive Impairment
    Gwizdka, Jacek
    Tessmer, Rachel
    Chan, Yao-Cheng
    Radhakrishnan, Kavita
    Henry, Maya L.
    INFORMATION SYSTEMS AND NEUROSCIENCE, NEUROIS RETREAT 2022, 2022, 58 : 187 - 200
  • [45] Eye-gaze patterns as students study worked-out examples in mechanics
    Smith, Adam D.
    Mestre, Jose P.
    Ross, Brian H.
PHYSICAL REVIEW SPECIAL TOPICS-PHYSICS EDUCATION RESEARCH, 2010, 6 (02)
  • [46] Robot-Human Gaze Behaviour: The Role of Eye Contact and Eye-Gaze Patterns in Human-Robot Interaction (HRI)
    Monasterio Astobiza, Anibal
    Toboso, Mario
    INTERACTIVE ROBOTICS: LEGAL, ETHICAL, SOCIAL AND ECONOMIC ASPECTS, 2022, 30 : 19 - 24
  • [47] Eye-gaze independent EEG-based brain-computer interfaces for communication
    Riccio, A.
    Mattia, D.
    Simione, L.
    Olivetti, M.
    Cincotti, F.
    JOURNAL OF NEURAL ENGINEERING, 2012, 9 (04)
  • [48] An Open Conversation on Using Eye-Gaze Methods in Studies of Neurodevelopmental Disorders
    Venker, Courtney E.
    Kover, Sara T.
JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH, 2015, 58 (06) : 1719 - 1732
  • [49] Eye-controlled mouse based on eye-gaze tracking using one camera
    Liu Ruian
    Jin Shijiu
    Wu Xiaorong
PROCEEDINGS OF THE FIRST INTERNATIONAL SYMPOSIUM ON TEST AUTOMATION & INSTRUMENTATION, VOLS 1 - 3, 2006 : 773 - 776
  • [50] The effects of social and cognitive cues on learning comprehension, eye-gaze pattern, and cognitive load in video instruction
    Jewoong Moon
    Jeeheon Ryu
    Journal of Computing in Higher Education, 2021, 33 : 39 - 63