Transformer-Based Sequence Modeling Short Answer Assessment Framework

Cited: 0
Authors
Sharmila, P. [1 ]
Anbananthen, Kalaiarasi Sonai Muthu [2 ]
Chelliah, Deisy [1 ]
Parthasarathy, S. [1 ]
Balasubramaniam, Baarathi [2 ]
Lurudusamy, Saravanan Nathan [3 ]
Affiliations
[1] Thiagarajar College of Engineering, Madurai 625015, Tamilnadu, India
[2] Faculty of Information Science and Technology, Multimedia University, Melaka 75450, Malaysia
[3] Division Consulting & Technology Services, Telekom Malaysia, Kuala Lumpur 50672, Malaysia
Source
HighTech and Innovation Journal | 2024 / Vol. 5 / No. 03
Keywords
Modeling languages; Natural language processing systems
DOI
10.28991/hij-2024-05-03-06
Abstract
Automated subjective assessment is a significant challenge because human language and reasoning exhibit semantic variability, subjectivity, ambiguity, and graded levels of judgment. Unlike objective exams, subjective assessments admit many valid answers, which complicates automated scoring. This paper proposes a novel approach that integrates advanced natural language processing (NLP) techniques with principled grading methods to address this challenge. By combining Transformer-based sequence language modeling with structured grading mechanisms, it aims to build more accurate and efficient automatic grading systems for subjective assessments in education. The proposed approach consists of three main phases. (1) Content summarization: relevant sentences are extracted using self-attention mechanisms, enabling the system to summarize the content of each response. (2) Key-term identification and comparison: key terms are identified within the responses and treated as overt tags; these tags are then compared against reference keys using cross-attention mechanisms, allowing a nuanced evaluation of the response content. (3) Grading: responses are graded with a weighted multi-criteria decision method, which assesses various quality aspects and assigns partial scores accordingly. Experimental results on the SQuAD dataset demonstrate the approach's effectiveness, achieving an F-score of 86%, along with significant improvements in ROUGE, BLEU, and METEOR scores, validating the efficacy of the proposed approach for automating subjective assessment tasks. © 2024, Ital Publication. All rights reserved.
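The three-phase pipeline in the abstract can be sketched in miniature. The snippet below is an illustrative approximation only: it replaces the paper's self-attention summarization and cross-attention key-term matching with naive term-set overlap, and the function and weight names (`grade`, `CRITERIA_WEIGHTS`) are hypothetical, not from the paper. What it does preserve is the final phase's idea: a weighted multi-criteria decision method that combines several quality scores into one partial-credit grade.

```python
# Hedged sketch of a weighted multi-criteria grading step.
# extract_key_terms() is a crude stand-in for the paper's attention-based
# key-term identification; only the weighted aggregation mirrors phase 3.

STOP_WORDS = {"the", "a", "an", "of", "and", "to", "is", "in", "by", "for"}

def extract_key_terms(text: str) -> set[str]:
    """Naive key-term extraction: lowercase tokens minus stop words."""
    return {w.strip(".,;:").lower() for w in text.split() if w} - STOP_WORDS

def criterion_scores(response: str, reference: str) -> dict[str, float]:
    """Score a response on three illustrative quality criteria in [0, 1]."""
    resp, ref = extract_key_terms(response), extract_key_terms(reference)
    coverage = len(resp & ref) / len(ref) if ref else 0.0      # key terms recalled
    precision = len(resp & ref) / len(resp) if resp else 0.0   # terms on-topic
    completeness = min(len(response.split()) / max(len(reference.split()), 1), 1.0)
    return {"coverage": coverage, "precision": precision, "completeness": completeness}

# Hypothetical criterion weights; the paper does not publish specific values.
CRITERIA_WEIGHTS = {"coverage": 0.5, "precision": 0.3, "completeness": 0.2}

def grade(response: str, reference: str, max_marks: float = 5.0) -> float:
    """Weighted multi-criteria aggregation with partial credit."""
    scores = criterion_scores(response, reference)
    weighted = sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)
    return round(weighted * max_marks, 2)
```

A perfect match earns full marks, an empty answer earns zero, and partially overlapping answers land in between, which is the partial-scoring behavior the weighted decision method is meant to provide.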
Pages: 627-639