Classification approach for understanding implications of emotions using eye-gaze

Cited by: 0
Authors
Pradeep Raj Krishnappa Babu
Uttama Lahiri
Affiliations
[1] Indian Institute of Technology Gandhinagar,Center for Cognitive Science
[2] Indian Institute of Technology Gandhinagar,Electrical Engineering Department
Keywords
Autism; Virtual reality; Eye-tracking; Fixation duration; Pupil diameter; Blink rate; Classification; SVM;
DOI: not available
Abstract
An atypical viewing pattern is one of the core deficits of individuals with Autism Spectrum Disorder (ASD). It diminishes their ability to understand a communicator's facial emotional expression and often leads them to misinterpret the intended emotion. Here, we investigated the feasibility of using gaze-related indices to estimate distinctive changes corresponding to various emotions. We designed a usability study in which nine individuals with ASD and Typically Developing (TD) individuals were exposed to Virtual Reality (VR) based social scenarios. The VR scenes presented virtual characters who narrated their social experiences as short stories with context-relevant emotional expressions. Simultaneously, we collected each participant's gaze-related physiological indices (PIs) and behavioral looking-pattern indices (BIs) using a technologically enhanced eye tracker. These PIs and BIs were then used to classify the implications of the emotional expressions both within and across the ASD and TD groups. The results indicate that gaze-related indices can discriminate among emotions with 97% accuracy in intra-group analysis and 100% accuracy in inter-group analysis.
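The classification step described in the abstract, an SVM trained on gaze features such as fixation duration, pupil diameter, and blink rate, can be sketched as follows. This is an illustrative reconstruction only: the feature values are synthetic, and the RBF kernel, feature scaling, and cross-validation setup are assumptions, not details reported by the paper.

```python
# Hypothetical sketch: SVM classification of emotion categories from
# gaze-related indices (PIs: pupil diameter, blink rate; BIs: fixation
# duration). All data below are synthetic; they do not reproduce the study.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
emotions = ["happy", "sad", "angry", "fearful"]

X, y = [], []
for label, _emotion in enumerate(emotions):
    # Each sample: [fixation duration (s), pupil diameter (mm), blink rate (/min)].
    # Class means are shifted per emotion to mimic distinctive gaze changes.
    base = np.array([0.3 + 0.1 * label, 3.0 + 0.2 * label, 10.0 + 2.0 * label])
    for _ in range(30):
        X.append(base + rng.normal(0.0, 0.05, size=3))
        y.append(label)
X, y = np.array(X), np.array(y)

# Standardize features, then fit an RBF-kernel SVM (kernel choice is assumed).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

With well-separated synthetic classes the pipeline reaches near-perfect accuracy; on real gaze data, performance would depend on feature extraction and participant variability.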
Pages: 2701-2713 (12 pages)