Research on the Creative Performance of Digital Film and Television Works Based on Virtual Reality Technology

Cited by: 0
Authors
Zhang J. [1 ]
Feng Y. [1 ]
Affiliation
[1] School of Digital Media and Performance, Sichuan Geely University, Chengdu, Sichuan
Keywords
Creative performance; Digital film and television works; EEG denoising; Feature recognition; Virtual reality technology
DOI
10.2478/amns-2024-0633
Abstract
This study investigates the application of virtual reality technology to the creative performance of digital film and television works, with particular attention to the role of EEG signal denoising and feature recognition in enhancing the audience experience. EEG signals are processed with wavelet threshold denoising and a parallel RLS adaptive filtering algorithm to improve the accuracy and reliability of the data. Features are then extracted from the denoised EEG signals using a bi-hemispheric domain adversarial neural network (BiDANN) to recognize users’ emotional responses more accurately. The experimental results show that users’ concentration and emotional responses improve significantly in the virtual reality environment, with mean concentration reaching 74.21 and the mean electrodermal measurement reaching 6.19. In addition, eye-movement interaction experiments show that different types of digital film and television works elicit different patterns of attention allocation from users in the VR environment, leading to different creative performance effects. The results demonstrate that virtual reality technology can significantly enhance the creative performance of digital film and television works and improve the audience’s viewing experience. © 2023 Jicheng Zhang and Yi Feng, published by Sciendo.
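The sketch below, in Python, is a minimal illustration of the kind of EEG preprocessing pipeline the abstract names: wavelet threshold denoising followed by RLS adaptive filtering. All settings are assumptions for illustration only (PyWavelets with a db4 wavelet, a universal soft threshold, a standard rather than parallel RLS update, and toy signals); none of them are taken from the paper, and the BiDANN feature-extraction stage is omitted because the abstract does not describe its inputs or architecture.

    # Illustrative EEG preprocessing sketch: wavelet threshold denoising + RLS adaptive filtering.
    # Wavelet family, threshold rule, filter order, and forgetting factor are assumed values.
    import numpy as np
    import pywt  # PyWavelets

    def wavelet_denoise(signal, wavelet="db4", level=4):
        """Soft-threshold the detail coefficients with a universal threshold."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        # Noise level estimated from the finest detail band (median absolute deviation).
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
        coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(signal)]

    def rls_filter(desired, reference, order=8, lam=0.99, delta=1e2):
        """Standard RLS adaptive filter: removes the component of `desired`
        that is correlated with `reference` (e.g. an artifact channel)."""
        w = np.zeros(order)
        P = np.eye(order) * delta
        out = np.zeros_like(desired)
        for n in range(len(desired)):
            x = reference[max(0, n - order + 1): n + 1][::-1]
            x = np.pad(x, (0, order - len(x)))       # regressor (most recent sample first)
            k = P @ x / (lam + x @ P @ x)            # gain vector
            e = desired[n] - w @ x                   # a-priori error
            w = w + k * e                            # coefficient update
            P = (P - np.outer(k, x @ P)) / lam       # inverse-correlation update
            out[n] = e                               # artifact-reduced sample
        return out

    if __name__ == "__main__":
        t = np.linspace(0, 2, 1000)
        eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # toy EEG
        artifact = 0.8 * np.sin(2 * np.pi * 1 * t)                        # toy artifact channel
        cleaned = rls_filter(wavelet_denoise(eeg + artifact), artifact)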
Related papers
50 results in total
  • [21] Research on Rehabilitation Exercise Based on Virtual Reality Technology
    Ma, Hongxia
    PROCEEDINGS OF THE 10TH CONFERENCE ON MAN-MACHINE-ENVIRONMENT SYSTEM ENGINEERING, 2010, : 331 - 334
  • [22] Research on Telemedicine Technology and Implement based on Virtual Reality
    Ji, Hong
    Wang, Jing
    Gao, Jia
    Liu, Xiaohui
    PROCEEDINGS OF 2016 IEEE ADVANCED INFORMATION MANAGEMENT, COMMUNICATES, ELECTRONIC AND AUTOMATION CONTROL CONFERENCE (IMCEC 2016), 2016, : 1581 - 1586
  • [23] Research on the Application of Virtual Reality Technology in Digital Media Art Teaching
    Shi Y.
    Liu Y.
    Computer-Aided Design and Applications, 2024, 21 (S17): 171 - 186
  • [24] Virtual test research for the vehicle performance based on virtual reality
    Liang Jiahong
    Li Shilei
    Bai Youliang
    Li Meng
    ISTM/2007: 7TH INTERNATIONAL SYMPOSIUM ON TEST AND MEASUREMENT, VOLS 1-7, CONFERENCE PROCEEDINGS, 2007, : 355 - 358
  • [25] Performance and Application of Virtual Reality Technology (VR) in Digital Protection of Buildings
    Huang Jianfeng
    Gao Hua
    2020 6TH INTERNATIONAL CONFERENCE ON ENERGY MATERIALS AND ENVIRONMENT ENGINEERING, 2020, 508
  • [26] Artificial Intelligence of Internet of Things and Virtual Reality Technology in the Image Reconstruction of Film and Television Characters
    Guo, Hanlin
    MACHINE LEARNING, IMAGE PROCESSING, NETWORK SECURITY AND DATA SCIENCES, MIND 2022, PT II, 2022, 1763 : 146 - 153
  • [27] The Path of Film and Television Animation Creation Using Virtual Reality Technology under the Artificial Intelligence
    Liu, Xin
    Pan, Hua
    SCIENTIFIC PROGRAMMING, 2022, 2022
  • [28] Research on Postproduction of Film and Television Based on Computer Multimedia Technology
    Xu, Xiao
    Yan, Hao
    Wang, Xiaolei
    SCIENTIFIC PROGRAMMING, 2022, 2022
  • [29] The Study of Film and TV Cultural Creative Industry Based on Digital Technology
    Liu Jingchen
    Xiang Zhongping
    2009 INTERNATIONAL CONFERENCE ON ENVIRONMENTAL SCIENCE AND INFORMATION APPLICATION TECHNOLOGY,VOL I, PROCEEDINGS, 2009, : 705 - +
  • [30] Effects of virtual reality on creative design performance and creative experiential learning
    Chang, Yu-Shan
    Chou, Chia-Hui
    Chuang, Meng-Jung
    Li, Wen-Hung
    Tsai, I-Fan
    INTERACTIVE LEARNING ENVIRONMENTS, 2023, 31 (02) : 1142 - 1157