Cognitive Context Detection in UAS Operators Using Eye-Gaze Patterns on Computer Screens

Cited by: 4
Authors
Mannaru, Pujitha [1 ]
Balasingam, Balakumar [1 ]
Pattipati, Krishna [1 ]
Sibley, Ciara [2 ]
Coyne, Joseph [2 ]
Affiliations
[1] Univ Connecticut, Dept Elect & Comp Engn, 371 Fairfield Way,U-4157, Storrs, CT 06269 USA
[2] Naval Res Lab, Warfighter Human Syst Integrat Lab, 4555 Overlook Ave SW, Washington, DC 20375 USA
Source
NEXT-GENERATION ANALYST IV | 2016 / Vol. 9851
Keywords
unmanned aerial systems; unmanned aerial vehicles; human computer interaction; operator fatigue detection; cognitive workload; eye-gaze metrics; eye movement metrics
DOI
10.1117/12.2224184
CLC Number
O43 [Optics]
Subject Classification Codes
070207; 0803
Abstract
In this paper, we demonstrate the use of eye-gaze metrics of unmanned aerial systems (UAS) operators as effective indices of their cognitive workload. Our analyses are based on an experiment in which twenty participants performed pre-scripted UAS missions at three difficulty levels by interacting with two custom-designed graphical user interfaces (GUIs) displayed side by side. First, we compute several eye-gaze metrics, both traditional eye-movement metrics and newly proposed ones, and analyze their effectiveness as cognitive classifiers. Most of the eye-gaze metrics are computed by dividing the computer screen into "cells". Then, we perform several analyses to select metrics for effective cognitive context classification in our specific application; the objectives of these analyses are to (i) identify appropriate ways to divide the screen into cells; (ii) select appropriate metrics for training and classification of cognitive features; and (iii) identify a suitable classification method.
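As an illustrative sketch only (the grid size, function names, and metric choices below are assumptions for exposition, not the paper's implementation), cell-based gaze metrics such as the per-cell dwell fraction and the number of between-cell gaze transitions could be computed along these lines:

```python
# Illustrative sketch: bin gaze samples into screen "cells" and compute
# two simple cell-based metrics. Function names and the 2x2 grid are
# assumptions for exposition, not the authors' actual implementation.
import numpy as np


def _cell_ids(gaze_xy, screen_wh, grid):
    """Map each (x, y) gaze sample to a flat cell index on a rows x cols grid."""
    w, h = screen_wh
    rows, cols = grid
    col_idx = np.clip((gaze_xy[:, 0] / w * cols).astype(int), 0, cols - 1)
    row_idx = np.clip((gaze_xy[:, 1] / h * rows).astype(int), 0, rows - 1)
    return row_idx * cols + col_idx


def cell_dwell_fractions(gaze_xy, screen_wh, grid):
    """Fraction of gaze samples falling in each cell (sums to 1)."""
    rows, cols = grid
    counts = np.bincount(_cell_ids(gaze_xy, screen_wh, grid),
                         minlength=rows * cols).astype(float)
    return (counts / counts.sum()).reshape(rows, cols)


def transition_count(gaze_xy, screen_wh, grid):
    """Number of consecutive gaze samples that land in different cells."""
    ids = _cell_ids(gaze_xy, screen_wh, grid)
    return int(np.sum(ids[1:] != ids[:-1]))
```

Feature vectors built from such metrics (one value per cell, plus scalar summaries like the transition count) could then be fed to an off-the-shelf classifier to discriminate between task difficulty levels.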
Pages: 11