Optimization of Choreography Teaching with Deep Learning and Neural Networks

Cited by: 0
Authors
Zhou, Qianling [1 ]
Tong, Yan [2 ]
Si, Hongwei [3 ]
Zhou, Kai [4 ]
Affiliations
[1] School of Music and Dance, Hunan Women's University, Changsha, Hunan 410004, China
[2] School of Music, South China Normal University, Guangzhou, Guangdong 510631, China
[3] Department of the History of Science, Tsinghua University, Beijing, China
[4] School of Social Development and Management, Hunan Women's University, Changsha, Hunan 410004, China
Keywords
Deep learning; Learning systems; Memory architecture; Network architecture; Stamping
DOI
Not available
Abstract
To advance intelligent dance education and networked choreography technology, this research develops a system that automatically generates continuous choreography using deep learning. First, it overcomes a limitation of traditional choreography-generation architectures, whose separate dynamic-segmentation and processing stages cannot be optimized globally. Second, it proposes an end-to-end architecture for generating continuous dance notation, coupled with a temporal classifier. On this basis, a dynamic time-stamping model is designed for frame clustering. Experiments show that the model achieves high-performance movement time-stamping and, combined with continuous motion recognition technology, produces refined continuous choreography that recognizes movements globally and marks their durations. This research enables the efficient and refined production of digital continuous choreography, provides advanced technical means for choreography education, and offers useful experience for school-based online choreography teaching. © 2022 Qianling Zhou et al.
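The abstract does not publish the model itself, but the "movement time-stamping" step it describes, turning per-frame predictions into timed movement segments, can be illustrated with a minimal sketch. Assuming (hypothetically) that the network emits one movement label per video frame, collapsing runs of identical labels yields each movement's start and end times:

```python
def timestamp_segments(frame_labels, fps=25.0):
    """Collapse a per-frame label sequence into (label, start_s, end_s) segments.

    frame_labels: one predicted movement label per frame, e.g. ["A", "A", "B", ...]
    fps: frame rate used to convert frame indices to seconds.
    """
    segments = []
    start = 0
    for i in range(1, len(frame_labels) + 1):
        # A segment ends when the label changes or the sequence runs out.
        if i == len(frame_labels) or frame_labels[i] != frame_labels[start]:
            segments.append((frame_labels[start], start / fps, i / fps))
            start = i
    return segments


# Six frames at 2 fps: movement "A" (2 frames), "B" (3 frames), "A" (1 frame).
print(timestamp_segments(["A", "A", "B", "B", "B", "A"], fps=2.0))
# → [('A', 0.0, 1.0), ('B', 1.0, 2.5), ('A', 2.5, 3.0)]
```

The function names, label format, and frame rate here are illustrative assumptions, not details from the paper; the paper's actual model additionally uses a temporal classifier to produce the frame labels end-to-end.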
Related papers
50 in total
  • [31] Learning by optimization in random neural networks
    Atalay, V
    ADVANCES IN COMPUTER AND INFORMATION SCIENCES '98, 1998, 53 : 143 - 148
  • [32] The Optimization of Learning Rate for Neural Networks
    Huang, Weizhe
    Chen, Chi-Hua
    ASIA-PACIFIC JOURNAL OF CLINICAL ONCOLOGY, 2023, 19 : 17 - 17
  • [33] Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis
    Chernoded, Andrey
    Dudko, Lev
    Myagkov, Igor
    Volkov, Petr
    XXIII INTERNATIONAL WORKSHOP HIGH ENERGY PHYSICS AND QUANTUM FIELD THEORY (QFTHEP 2017), 2017, 158
  • [34] Efficient Hyperparameter Optimization for Convolution Neural Networks in Deep Learning: A Distributed Particle Swarm Optimization Approach
    Guo, Yu
    Li, Jian-Yu
    Zhan, Zhi-Hui
    CYBERNETICS AND SYSTEMS, 2020, 52 (01) : 36 - 57
  • [35] Introduction to Machine Learning, Neural Networks, and Deep Learning
    Choi, Rene Y.
    Coyner, Aaron S.
    Kalpathy-Cramer, Jayashree
    Chiang, Michael F.
    Campbell, J. Peter
    TRANSLATIONAL VISION SCIENCE & TECHNOLOGY, 2020, 9 (02)
  • [36] Graph neural networks for deep portfolio optimization
    Ekmekcioglu, Omer
    Pinar, Mustafa C.
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (28): 20663 - 20674
  • [37] An Optimization Strategy for Deep Neural Networks Training
    Wu, Tingting
    Zeng, Peng
    Song, Chunhe
    2022 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, COMPUTER VISION AND MACHINE LEARNING (ICICML), 2022, : 596 - 603
  • [39] A Study of Optimization in Deep Neural Networks for Regression
    Chen, Chieh-Huang
    Lai, Jung-Pin
    Chang, Yu-Ming
    Lai, Chi-Ju
    Pai, Ping-Feng
    ELECTRONICS, 2023, 12 (14)
  • [40] DeepQGHO: Quantized Greedy Hyperparameter Optimization in Deep Neural Networks for on-the-Fly Learning
    Chowdhury, Anjir Ahmed
    Hossen, Md Abir
    Azam, Md Ali
    Rahman, Md Hafizur
    IEEE ACCESS, 2022, 10 : 6407 - 6416