EHTask: Recognizing User Tasks From Eye and Head Movements in Immersive Virtual Reality

Cited: 16
Authors
Hu, Zhiming [1 ]
Bulling, Andreas [2 ]
Li, Sheng [1 ,3 ]
Wang, Guoping [1 ,3 ]
Affiliations
[1] Peking Univ, Sch Comp Sci, Beijing 100871, Peoples R China
[2] Univ Stuttgart, D-70174 Stuttgart, Germany
[3] Peking Univ, Natl Biomed Imaging Ctr, Beijing 100871, Peoples R China
Funding
National Natural Science Foundation of China; National Key R&D Program of China; European Research Council;
Keywords
Task analysis; Videos; Head; Visualization; Virtual reality; Magnetic heads; Solid modeling; Visual attention; task recognition; eye movements; head movements; deep learning; virtual reality; GAZE PREDICTION;
DOI
10.1109/TVCG.2021.3138902
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline codes
081202; 0835;
Abstract
Understanding human visual attention in immersive virtual reality (VR) is crucial for many important applications, including gaze prediction, gaze guidance, and gaze-contingent rendering. However, previous work on visual attention analysis has typically explored only one specific VR task and paid little attention to the differences between tasks. Moreover, existing task recognition methods have typically focused on 2D viewing conditions and only explored the effectiveness of human eye movements. To address these limitations, we first collect eye and head movements of 30 participants performing four tasks, i.e., Free viewing, Visual search, Saliency, and Track, in fifteen 360-degree VR videos. Using this dataset, we analyze the patterns of human eye and head movements and reveal significant differences across tasks in terms of fixation duration, saccade amplitude, head rotation velocity, and eye-head coordination. We then propose EHTask, a novel learning-based method that employs eye and head movements to recognize user tasks in VR. We show that our method significantly outperforms state-of-the-art methods derived from 2D viewing conditions, both on our dataset (accuracy of 84.4% versus 62.8%) and on a real-world dataset (61.9% versus 44.1%). As such, our work provides meaningful insights into human visual attention under different VR tasks and guides future work on recognizing user tasks in VR.
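The abstract's behavioral analysis rests on statistics such as saccade amplitude and head rotation velocity computed from eye and head angle traces. A minimal sketch of how such features might be extracted from yaw-angle samples follows; the function names, the sampling format, and the 30 deg/s saccade threshold are illustrative assumptions, not details taken from the paper:

```python
import math
from statistics import mean

def angular_speed(a0, a1, dt):
    """Angular speed in deg/s between two yaw angles, via the shortest arc."""
    d = abs(a1 - a0) % 360.0
    d = min(d, 360.0 - d)  # wrap-around: 359 deg -> 1 deg is a 2-deg move
    return d / dt

def extract_features(samples, saccade_thresh=30.0):
    """samples: list of (t_seconds, gaze_yaw_deg, head_yaw_deg) tuples.

    Returns (mean gaze speed, mean head speed, fraction of intervals
    whose gaze speed exceeds saccade_thresh) -- a crude stand-in for
    the fixation/saccade and head-velocity statistics the paper analyzes.
    """
    gaze_speeds, head_speeds = [], []
    for (t0, g0, h0), (t1, g1, h1) in zip(samples, samples[1:]):
        dt = t1 - t0
        gaze_speeds.append(angular_speed(g0, g1, dt))
        head_speeds.append(angular_speed(h0, h1, dt))
    saccade_frac = mean(1.0 if s > saccade_thresh else 0.0
                        for s in gaze_speeds)
    return mean(gaze_speeds), mean(head_speeds), saccade_frac
```

In EHTask itself such signals feed a learned classifier over the four tasks; this fragment only illustrates the kind of per-interval kinematic features the paper's analysis discusses.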
Pages: 1992-2004
Page count: 13
Related articles (50 total)
  • [41] Real-time recording and classification of eye movements in an immersive virtual environment
    Diaz, Gabriel
    Cooper, Joseph
    Kit, Dmitry
    Hayhoe, Mary
    JOURNAL OF VISION, 2013, 13 (12)
  • [42] BioMove: Biometric User Identification from Human Kinesiological Movements for Virtual Reality Systems
    Olade, Ilesanmi
    Fleming, Charles
    Liang, Hai-Ning
    SENSORS, 2020, 20 (10)
  • [43] Developing a Tutorial for Improving Usability and User Skills in an Immersive Virtual Reality Experience
    Miguel-Alonso, Ines
    Rodriguez-Garcia, Bruno
    Checa, David
    Tommaso De Paolis, Lucio
    EXTENDED REALITY, XR SALENTO 2022, PT II, 2022, 13446 : 63 - 78
  • [44] Enhancing social functioning using multi-user, immersive virtual reality
    Holt, D. J.
    Detore, N. R.
    Aideyan, B.
    Utter, L.
    Vinke, L.
    Johnson, D. S.
    Zimmerman, J.
    Dokholyan, K. N.
    Burke, A.
    SCIENTIFIC REPORTS, 2025, 15 (01)
  • [45] Virtual Reality Conferencing: Multi-user immersive VR experiences on the web
    Gunkel, Simon N. B.
    Stokking, Hans M.
    Prins, Martin J.
    van der Stap, Nanda
    ter Haar, Frank B.
    Niamut, Omar A.
    PROCEEDINGS OF THE 9TH ACM MULTIMEDIA SYSTEMS CONFERENCE (MMSYS'18), 2018, : 498 - 501
  • [47] Realistic Immersion of a User into Humanoids of Different Sizes and Proportions in Immersive Virtual Reality
    Zhao, Weiwei
    Madhavan, Vis
    2013 IEEE VIRTUAL REALITY CONFERENCE (VR), 2013, : 149 - 150
  • [48] A Framework for Developing Multi-user Immersive Virtual Reality Learning Environments
    Checa, David
    Rodriguez-Garcia, Bruno
    Guillen-Sanz, Henar
    Miguel-Alonso, Ines
    EXTENDED REALITY, XR SALENTO 2023, PT I, 2023, 14218 : 89 - 103
  • [49] Microlearning in Immersive Virtual Reality: A User-Centered Analysis of Learning Interfaces
    Gill, Amarpreet
    Irwin, Derek
    Sun, Linjing
    Towey, Dave
    Zhang, Gege
    Zhang, Yanhui
    IEEE TRANSACTIONS ON LEARNING TECHNOLOGIES, 2025, 18 : 165 - 178
  • [50] Group Decision-Making in Multi-User Immersive Virtual Reality
    Moser, Ivan
    Chiquet, Sandra
    Strahm, Sebastian K.
    Mast, Fred W.
    Bergamin, Per
    CYBERPSYCHOLOGY BEHAVIOR AND SOCIAL NETWORKING, 2020, 23 (12) : 846 - 853