Classification approach for understanding implications of emotions using eye-gaze

Cited by: 0
Authors
Pradeep Raj Krishnappa Babu
Uttama Lahiri
Affiliations
[1] Indian Institute of Technology Gandhinagar,Center for Cognitive Science
[2] Indian Institute of Technology Gandhinagar,Electrical Engineering Department
Keywords
Autism; Virtual reality; Eye-tracking; Fixation duration; Pupil diameter; Blink rate; Classification; SVM;
DOI: not available
Abstract
An atypical visual viewing pattern is one of the core deficits of individuals with Autism Spectrum Disorder (ASD). It diminishes their ability to understand a communicator's facial emotional expression, so they often misinterpret the intended emotion. Here, we investigated the feasibility of using a viewer's gaze-related indices to estimate distinctive changes corresponding to various emotions. We designed a usability study in which nine individuals with ASD and nine Typically Developing (TD) individuals were exposed to Virtual Reality (VR) based social scenarios. The VR scenes presented virtual characters who narrated their social experiences as short stories with context-relevant emotional expressions. Simultaneously, we collected each participant's gaze-related physiological indices (PIs) and behavioral looking-pattern indices (BIs) using a technologically enhanced eye-tracker. These PIs and BIs were then used to classify the implications of the emotional expressions both within and across the ASD and TD groups. Results of the usability study indicate that gaze-related indices can discriminate among the emotions with 97% accuracy in the intra-group analysis and 100% accuracy in the inter-group analysis.
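The abstract names the classifier (SVM) and the gaze features (fixation duration, pupil diameter, blink rate) but not the pipeline itself. As a rough, hypothetical sketch of such a feature-based SVM classification, not the authors' actual method or data, one might do the following with synthetic trials for two emotion classes:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical gaze-related indices per trial:
# [fixation duration (s), pupil diameter (mm), blink rate (blinks/min)].
# Two synthetic emotion classes with shifted means stand in for real data.
X_class_a = rng.normal([0.30, 3.5, 12.0], [0.05, 0.3, 2.0], size=(50, 3))
X_class_b = rng.normal([0.22, 4.1, 18.0], [0.05, 0.3, 2.0], size=(50, 3))
X = np.vstack([X_class_a, X_class_b])
y = np.array([0] * 50 + [1] * 50)  # emotion labels

# Standardize the low-dimensional physiological features,
# then fit an RBF-kernel SVM and estimate accuracy by cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(round(scores.mean(), 2))
```

Standardizing before the SVM matters here because the three indices live on very different scales (seconds, millimeters, events per minute); without it the kernel distance would be dominated by the largest-magnitude feature.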
Pages: 2701–2713
Page count: 12
Related papers (50 total; items [21]–[30] shown)
  • [21] Eye-gaze interfaces using electro-oculography (EOG)
    Tokyo Institute of Technology, 2-12-1 O-okayama, Meguro-ku Tokyo 152-8552, Japan
    Int Conf Intell User Interfaces Proc IUI, 1600, (28-32):
  • [22] Activity recognition using eye-gaze movements and traditional interactions
    Courtemanche, Francois
    Aimeur, Esma
    Dufresne, Aude
    Najjar, Mehdi
    Mpondo, Franck
    INTERACTING WITH COMPUTERS, 2011, 23 (03) : 202 - 213
  • [23] HUMAN-COMPUTER INTERACTION USING EYE-GAZE INPUT
    HUTCHINSON, TE
    WHITE, KP
    MARTIN, WN
    REICHERT, KC
    FREY, LA
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS, 1989, 19 (06): : 1527 - 1534
  • [24] Eye-Gaze Tracking based Interaction in India
    Biswas, Pradipta
    Langdon, Pat
    6TH INTERNATIONAL CONFERENCE ON INTELLIGENT HUMAN COMPUTER INTERACTION, IHCI 2014, 2014, 39 : 59 - 66
  • [25] A System for Web Browsing by Eye-Gaze Input
    Abe, Kiyohiko
    Owada, Kosuke
    Ohi, Shoichi
    Ohyama, Minoru
    ELECTRONICS AND COMMUNICATIONS IN JAPAN, 2008, 91 (05) : 11 - 18
  • [26] Eye-gaze orienting to auditory and tactile targets
    Soto-Faraco, S
    Kingstone, A
    JOURNAL OF PSYCHOPHYSIOLOGY, 2005, 19 (01) : 61 - 61
  • [27] An Open Conversation on Using Eye-Gaze Methods in Studies of Neurodevelopmental Disorders
    Venker, Courtney E.
    Kover, Sara T.
    JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH, 2015, 58 (06): : 1719 - 1732
  • [28] Eye-controlled mouse based on eye-gaze tracking using one camera
    Liu Ruian
    Jin Shijiu
    Wu Xiaorong
    PROCEEDINGS OF THE FIRST INTERNATIONAL SYMPOSIUM ON TEST AUTOMATION & INSTRUMENTATION, VOLS 1 - 3, 2006, : 773 - 776
  • [29] Configural processing in the perception of eye-gaze direction
    Jenkins, J
    Langton, SRH
    PERCEPTION, 2003, 32 (10) : 1181 - 1188
  • [30] Feasibility of Longitudinal Eye-Gaze Tracking in the Workplace
    Hutt S.
    Stewart A.E.B.
    Gregg J.
    Mattingly S.
    D'mello S.K.
    Proceedings of the ACM on Human-Computer Interaction, 2022, 6 (ETRA)