ON THE LINK BETWEEN EMOTION, ATTENTION AND CONTENT IN VIRTUAL IMMERSIVE ENVIRONMENTS

Cited by: 2
Authors
Guimard, Quentin [1 ]
Robert, Florent [1 ,2 ]
Bauce, Camille [1 ]
Ducreux, Aldric [1 ]
Sassatelli, Lucile [1 ,4 ]
Wu, Hui-Yin [2 ]
Winckler, Marco [1 ,2 ]
Gros, Auriane [3 ]
Affiliations
[1] Univ Cote Azur, CNRS, I3S, Nice, France
[2] Univ Cote Azur, Inria, Nice, France
[3] Univ Cote Azur, CHU Nice, CoBTeK, Nice, France
[4] Inst Univ France, Paris, France
Source
2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP | 2022
Keywords
360 degrees videos; saliency maps; emotions; physiological signals; gaze; REALITY;
DOI
10.1109/ICIP46576.2022.9897903
CLC classification number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
While immersive media have been shown to generate more intense emotions, saliency information has been shown to be a key component for the assessment of their quality, owing to the various portions of the sphere (viewports) a user can attend to. In this article, we investigate the tripartite connection between user attention, user emotion and visual content in immersive environments. To do so, we present a new dataset enabling the analysis of different types of saliency, both low-level and high-level, in connection with the user's state in 360 degrees videos. Head and gaze movements are recorded along with self-reports and continuous physiological measurements of emotions. We then study how the accuracy of saliency estimators in predicting user attention depends on user-reported and physiologically-sensed emotional perceptions. Our results show that high-level saliency better predicts user attention for higher levels of arousal. We discuss how this work serves as a first step to understand and predict user attention and intents in immersive interactive environments.
Pages: 2521 - 2525
Number of pages: 5
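
A minimal sketch, not taken from the paper, of the kind of evaluation the abstract describes: gaze fixations are turned into a ground-truth attention (saliency) map, a predicted saliency map is scored against it with a Pearson correlation coefficient, and scores are grouped by reported arousal. The map resolution, Gaussian smoothing width, 1-9 arousal scale, and median split are illustrative assumptions; the synthetic data only stands in for the dataset described above.

```python
# Illustrative sketch (assumptions noted inline), not the authors' pipeline.
import numpy as np
from scipy.ndimage import gaussian_filter

H, W = 64, 128  # equirectangular saliency-map resolution (assumed)

def gaze_to_saliency(gaze_px, sigma=3.0):
    """Turn (row, col) gaze fixations into a normalized ground-truth saliency map."""
    sal = np.zeros((H, W))
    for r, c in gaze_px:
        sal[r, c] += 1.0
    sal = gaussian_filter(sal, sigma=sigma)
    return sal / (sal.sum() + 1e-12)

def pearson_cc(pred, gt):
    """Pearson correlation coefficient between predicted and gaze-based maps."""
    p, g = pred.ravel(), gt.ravel()
    p = (p - p.mean()) / (p.std() + 1e-12)
    g = (g - g.mean()) / (g.std() + 1e-12)
    return float((p * g).mean())

# Synthetic stand-in data: per-clip gaze samples, a predicted map, an arousal score.
rng = np.random.default_rng(0)
clips = []
for _ in range(20):
    gaze = list(zip(rng.integers(0, H, 50), rng.integers(0, W, 50)))
    predicted = gaussian_filter(rng.random((H, W)), sigma=3.0)
    predicted /= predicted.sum()
    arousal = rng.uniform(1, 9)  # e.g. a 1-9 self-assessment scale (assumption)
    clips.append((gaze, predicted, arousal))

# Compare prediction accuracy for low vs. high arousal (median split, assumed).
scores = [(arousal, pearson_cc(pred, gaze_to_saliency(gaze)))
          for gaze, pred, arousal in clips]
median_arousal = np.median([a for a, _ in scores])
low = [cc for a, cc in scores if a < median_arousal]
high = [cc for a, cc in scores if a >= median_arousal]
print(f"mean CC, low arousal:  {np.mean(low):.3f}")
print(f"mean CC, high arousal: {np.mean(high):.3f}")
```

With real predicted maps and real gaze traces in place of the random stand-ins, the same grouping would show whether a given saliency estimator tracks attention better under high arousal, which is the comparison the paper reports.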