Cybersickness Reduction via Gaze-Contingent Image Deformation

Cited by: 0
Authors
Groth, Colin [1 ]
Magnor, Marcus [1 ]
Grogorick, Steve [1 ]
Eisemann, Martin [1 ]
Didyk, Piotr [2 ]
Affiliations
[1] TU Braunschweig, Inst Comp Graph, Braunschweig, Germany
[2] Univ Svizzera Italiana, Lugano, Switzerland
Source
ACM TRANSACTIONS ON GRAPHICS | 2024, Vol. 43, No. 4
Funding
European Research Council;
Keywords
Virtual Reality; Cybersickness; VR; Vection; Foveation; Image Distortion; SIMULATOR SICKNESS; MOTION; VIEW;
DOI
10.1145/3658138
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline Code
081202; 0835;
Abstract
Virtual reality has ushered in a revolutionary era of immersive content perception. However, a persistent challenge in dynamic environments is the occurrence of cybersickness arising from a conflict between visual and vestibular cues. Prior techniques have demonstrated that limiting illusory self-motion, so-called vection, by blurring the peripheral part of images, introducing tunnel vision, or altering the camera path can effectively reduce the problem. Unfortunately, these methods often alter the user's experience with visible changes to the content. In this paper, we propose a new technique for reducing vection and combating cybersickness by subtly lowering the screen-space speed of objects in the user's peripheral vision. The method is motivated by our hypothesis that small modifications to the objects' velocity in the periphery and geometrical distortions in the peripheral vision can remain unnoticeable yet lead to reduced vection. This paper describes the experiments supporting this hypothesis and derives its limits. Furthermore, we present a method that exploits these findings by introducing subtle, screen-space geometrical distortions to animation frames to counteract the motion contributing to vection. We implement the method as a real-time post-processing step that can be integrated into existing rendering frameworks. The final validation of the technique and comparison to an alternative approach confirms its effectiveness in reducing cybersickness.
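The abstract only sketches the technique at a high level. The following is a minimal illustrative sketch in Python/NumPy, not the authors' implementation: each pixel of a rendered frame is warped against a fraction of its screen-space motion, with the fraction growing with eccentricity from the tracked gaze point. The function name deform_frame, the smoothstep falloff, and all parameter values (fovea_radius, blend_radius, max_compensation) are assumptions made for illustration, not values taken from the paper.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def smoothstep(edge0, edge1, x):
    """Hermite ramp from 0 to 1 over [edge0, edge1]."""
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def deform_frame(frame, flow, gaze, fovea_radius=0.15,
                 blend_radius=0.35, max_compensation=0.3):
    """Warp `frame` so peripheral pixels move against part of their
    optic flow, lowering their screen-space speed.

    frame : (H, W, 3) float array, current rendered frame
    flow  : (H, W, 2) float array, per-pixel screen-space motion in
            pixels (e.g. renderer motion vectors)
    gaze  : (x, y) gaze position in normalized [0, 1] screen coords
    fovea_radius, blend_radius : assumed eccentricities (normalized)
            where compensation starts and reaches full strength
    max_compensation : assumed fraction of the motion cancelled in the
            far periphery; must stay below visibility thresholds
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)

    # Eccentricity of every pixel from the gaze point (normalized).
    ecc = np.hypot(xs / w - gaze[0], ys / h - gaze[1])

    # Zero inside the fovea, ramping up toward the periphery.
    weight = smoothstep(fovea_radius, blend_radius, ecc) * max_compensation

    # Sample each output pixel from a position displaced along its flow
    # vector, which approximately cancels that fraction of the on-screen
    # motion between consecutive frames.
    src_x = xs + weight * flow[..., 0]
    src_y = ys + weight * flow[..., 1]

    warped = np.empty_like(frame)
    for c in range(frame.shape[2]):
        warped[..., c] = map_coordinates(frame[..., c], [src_y, src_x],
                                         order=1, mode='nearest')
    return warped
```

In an actual VR engine this logic would run as a per-frame fragment-shader post-process, fed by the renderer's motion vectors and the headset's eye-tracking sample, rather than as a CPU-side NumPy warp.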
Pages: 14