Musical Performances in Virtual Reality with Spatial and View-Dependent Audio Descriptions for Blind and Low-Vision Users

Cited by: 0
Authors
Dang, Khang [1 ]
Lee, Sooyeon [1 ]
Affiliations
[1] New Jersey Inst Technol, Newark, NJ 07102 USA
Keywords
Virtual Reality; Accessibility; Audio Descriptions; Musical Performances;
DOI
10.1145/3663548.3688492
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Virtual reality (VR), inherently reliant on spatial interaction, poses significant accessibility barriers for individuals who are blind or have low vision (BLV). Traditional audio descriptions (AD) typically provide verbal explanations of visual elements in 2D or flat video media, facilitating access for BLV audiences but failing to convey the complex spatial information essential in VR. This shortfall is especially pronounced in musical performances, where understanding the spatial arrangement of the stage setup and the movements of performers is crucial. To overcome these limitations, we have developed two AD approaches within VR-based 360-degree environments: Spatial AD for a dance performance and View-dependent AD for an instrumental performance. Spatial AD employs spatial audio technology to align descriptions with their corresponding visuals, dynamically repositioning the description to follow them, such as the movements of performers in the dance performance. View-dependent AD, in contrast, adapts descriptions to the orientation of the VR headset, activating a description when its target enters the central view of the camera, so that the narration aligns with the location the user is attending to within the VR environment. These methods are designed as enhancements to traditional AD, aiming to improve spatial orientation and immersion for BLV audiences. This demonstration showcases the potential of these AD approaches to improve interaction and engagement, furthering the development of inclusive virtual environments.
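The abstract describes two triggering and positioning rules: Spatial AD keeps the spoken description co-located with the visual it describes, while View-dependent AD plays a description only when its target enters the central view of the headset. The TypeScript sketch below illustrates one way such rules could work; it is a minimal sketch under assumed conventions, not the authors' implementation, and names such as updateSpatialAd, maybeTriggerViewAd, and the 30° central-view threshold are illustrative.

```typescript
// Hypothetical sketch of the two AD modes described in the abstract.
// All identifiers and the FOV threshold are assumptions for illustration.

type Vec3 = { x: number; y: number; z: number };

const dot = (a: Vec3, b: Vec3) => a.x * b.x + a.y * b.y + a.z * b.z;
const length = (v: Vec3) => Math.sqrt(dot(v, v));
const normalize = (v: Vec3): Vec3 => {
  const n = length(v) || 1;
  return { x: v.x / n, y: v.y / n, z: v.z / n };
};

/** Spatial AD: keep the description's audio source co-located with the
 *  performer it describes, so a spatialized renderer places the voice at
 *  the performer's position as they move across the stage. */
function updateSpatialAd(
  performerPos: Vec3,
  setSourcePosition: (p: Vec3) => void
): void {
  setSourcePosition(performerPos); // e.g., forwarded to an HRTF panner each frame
}

/** View-dependent AD: trigger a description only when its target falls
 *  within the central field of view, estimated from the headset's pose. */
function maybeTriggerViewAd(
  headPos: Vec3,
  headForward: Vec3,          // unit forward vector from headset orientation
  targetPos: Vec3,
  playDescription: () => void,
  centralFovDeg = 30          // assumed "central view" half-angle
): boolean {
  const toTarget = normalize({
    x: targetPos.x - headPos.x,
    y: targetPos.y - headPos.y,
    z: targetPos.z - headPos.z,
  });
  const cos = Math.min(1, Math.max(-1, dot(normalize(headForward), toTarget)));
  const angleDeg = (Math.acos(cos) * 180) / Math.PI;
  if (angleDeg <= centralFovDeg) {
    playDescription();
    return true;
  }
  return false;
}
```

In a real system both functions would presumably run per frame against the tracked headset pose, with the source position handed to a binaural (HRTF) renderer and with additional bookkeeping to avoid re-triggering a description that is already playing.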
Pages: 5
Related Papers (6)
  • [1] Opportunities for Accessible Virtual Reality Design for Immersive Musical Performances for Blind and Low-Vision People
    Dang, Khang
    Korreshi, Hamdi
    Iqbal, Yasir
    Lee, Sooyeon
    ACM SYMPOSIUM ON SPATIAL USER INTERACTION, SUI 2023, 2023,
  • [2] SPICA: Interactive Video Content Exploration through Augmented Audio Descriptions for Blind or Low-Vision Viewers
    Ning, Zheng
    Wimer, Brianna L.
    Jiang, Kaiwen
    Chen, Keyi
    Ban, Jerrick
    Tian, Yapeng
    Zhao, Yuhang
    Li, Toby Jia-Jun
    PROCEEDINGS OF THE 2024 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI 2024), 2024,
  • [3] Sonically Spatialized Screen Reading: Aiming to Restore Spatial Information for Blind and Low-Vision Users
    Cofino, Jonathan
    Barreto, Armando
    Adjouadi, Malek
    2012 PROCEEDINGS OF IEEE SOUTHEASTCON, 2012,
  • [4] Assessing Mobility of Blind and Low-Vision Individuals Through a Portable Virtual Reality System and a Comprehensive Questionnaire
    Isaksson-Daun, Johan
    Jansson, Tomas
    Nilsson, Johan
    IEEE ACCESS, 2024, 12 : 146089 - 146106
  • [5] Using Portable Virtual Reality to Assess Mobility of Blind and Low-Vision Individuals With the Audomni Sensory Supplementation Feedback
    Isaksson-Daun, Johan
    Jansson, Tomas
    Nilsson, Johan
    IEEE ACCESS, 2024, 12 : 26222 - 26241
  • [6] The Cross-Sensory Globe: Participatory Design of a 3D Audio-Tactile Globe Prototype for Blind and Low-Vision Users to Learn Geography
    Ghodke, Uttara
    Yusim, Lena
    Somanath, Sowmya
    Coppin, Peter
    PROCEEDINGS OF THE 2019 ACM DESIGNING INTERACTIVE SYSTEMS CONFERENCE (DIS 2019), 2019, : 399 - 412