Learning behaviour recognition method of English online course based on multimodal data fusion

Cited by: 0
Author
Li, Liangjie [1 ]
Affiliation
[1] Department of Foreign Affairs, Henan Finance University, Zhengzhou 450000, China
Abstract
Conventional methods for identifying English online course learning behaviours suffer from low recognition accuracy and high time cost. Therefore, a multimodal data fusion-based method for identifying English online course learning behaviours is proposed. Firstly, the analytic hierarchy process is used for decision-level fusion of the multimodal learning behaviour data. Secondly, based on the fusion results, weight coefficients are set to minimise losses and extract learning behaviour features. Finally, based on the extracted features, the optimal classification function is constructed to classify the learning behaviour of English online courses, and recognition of online course learning behaviour is completed using the transfer information of learning behaviour states. The experimental results show that the recognition accuracy of the proposed method is above 90%, that it shortens the recognition time of learning behaviour, and that it is reliable in practical application. © 2024 Inderscience Enterprises Ltd.
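The abstract outlines a three-step pipeline; the sketch below illustrates only the first step, decision-level fusion weighted by the analytic hierarchy process (AHP). It is a minimal sketch under stated assumptions: the modality names (video, audio, interaction logs), the pairwise comparison matrix, and the per-modality score vectors are hypothetical illustrations, not values reported in the paper.

```python
# Minimal sketch of AHP-weighted decision-level fusion for multimodal
# learning-behaviour scores. All concrete numbers below are assumptions
# for illustration, not taken from the paper.
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Derive modality weights from an AHP pairwise comparison matrix
    using the principal eigenvector, normalised to sum to 1."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    idx = np.argmax(np.real(eigvals))
    principal = np.real(eigvecs[:, idx])
    weights = principal / principal.sum()
    # Consistency check: CR = CI / RI, with Saaty's random index RI = 0.58 for n = 3.
    n = pairwise.shape[0]
    ci = (np.max(np.real(eigvals)) - n) / (n - 1)
    assert ci / 0.58 < 0.1, "pairwise judgements are inconsistent"
    return weights

def fuse_decisions(scores: dict[str, np.ndarray], weights: dict[str, float]) -> np.ndarray:
    """Weighted sum of per-modality class-score vectors (decision-level fusion)."""
    return sum(w * scores[m] for m, w in weights.items())

# Hypothetical pairwise judgements among three modalities.
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 3.0],
                     [1/5, 1/3, 1.0]])
weights = dict(zip(["video", "audio", "logs"], ahp_weights(pairwise)))

# Hypothetical per-modality posterior scores over four behaviour classes.
scores = {
    "video": np.array([0.6, 0.2, 0.1, 0.1]),
    "audio": np.array([0.3, 0.4, 0.2, 0.1]),
    "logs":  np.array([0.5, 0.3, 0.1, 0.1]),
}
fused = fuse_decisions(scores, weights)
predicted_class = int(np.argmax(fused))
```

In decision-level fusion each modality's classifier runs independently and only the resulting class-score vectors are combined, which matches the abstract's description of fusing decisions rather than raw features.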
DOI: 10.1504/IJBIDM.2024.140880
Pages: 336-349
Related papers (50 in total)
  • [31] Design of the Oral English Teaching Method Based on Multimodal Feature Fusion
    He, Xiaolei
    MOBILE INFORMATION SYSTEMS, 2022, 2022
  • [32] An Automatic Assessment Method for Spoken English Based on Multimodal Feature Fusion
    Zhang, Qijing
    WIRELESS COMMUNICATIONS & MOBILE COMPUTING, 2021, 2021
  • [33] Cosmo: Contrastive Fusion Learning with Small Data for Multimodal Human Activity Recognition
    Ouyang, Xiaomin
    Shuai, Xian
    Zhou, Jiayu
    Shi, Ivy Wang
    Xie, Zhiyuan
    Xing, Guoliang
    Huang, Jianwei
    PROCEEDINGS OF THE 2022 THE 28TH ANNUAL INTERNATIONAL CONFERENCE ON MOBILE COMPUTING AND NETWORKING, ACM MOBICOM 2022, 2022, : 324 - 337
  • [34] Classroom learning behavior recognition method for English teaching students based on adaptive feature fusion
    Li, Shuyu
    INTERNATIONAL JOURNAL OF BIOMETRICS, 2025, 17 (1-2)
  • [35] Emotion Recognition Based on Feedback Weighted Fusion of Multimodal Emotion Data
    Wei, Wei
    Jia, Qingxuan
    Feng, Yongli
    2017 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (IEEE ROBIO 2017), 2017, : 1682 - 1687
  • [36] An effective multimodal representation and fusion method for multimodal intent recognition
    Huang, Xuejian
    Ma, Tinghuai
    Jia, Li
    Zhang, Yuanjian
    Rong, Huan
    Alnabhan, Najla
    NEUROCOMPUTING, 2023, 548
  • [37] Research on Emotion Recognition Method of Flight Training Based on Multimodal Fusion
    Wang, Wendong
    Zhang, Haoyang
    Zhang, Zhibin
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2024, 40 (20) : 6478 - 6491
  • [38] MCL: A Contrastive Learning Method for Multimodal Data Fusion in Violence Detection
    Yang, Liu
    Wu, Zhenjie
    Hong, Junkun
    Long, Jun
    IEEE SIGNAL PROCESSING LETTERS, 2023, 30 : 408 - 412
  • [39] Emotion Recognition and Classification of Film Reviews Based on Deep Learning and Multimodal Fusion
    Na, Risu
    Sun, Ning
    WIRELESS COMMUNICATIONS & MOBILE COMPUTING, 2022, 2022
  • [40] Contrastive Learning-Based Multimodal Fusion Model for Automatic Modulation Recognition
    Liu, Fugang
    Pan, Jingyi
    Zhou, Ruolin
    IEEE COMMUNICATIONS LETTERS, 2024, 28 (01) : 78 - 82