Transformer-Based Sequence Modeling Short Answer Assessment Framework

Cited: 0
|
Authors
Sharmila, P. [1 ]
Anbananthen, Kalaiarasi Sonai Muthu [2 ]
Chelliah, Deisy [1 ]
Parthasarathy, S. [1 ]
Balasubramaniam, Baarathi [2 ]
Lurudusamy, Saravanan Nathan [3 ]
Affiliations
[1] Thiagarajar College of Engineering, Madurai 625015, Tamil Nadu, India
[2] Faculty of Information Science and Technology, Multimedia University, Melaka 75450, Malaysia
[3] Division Consulting & Technology Services, Telekom Malaysia, Kuala Lumpur 50672, Malaysia
Source
HighTech and Innovation Journal | 2024 / Vol. 5 / Issue 03
Keywords
Modeling languages; Natural language processing systems
DOI
10.28991/hij-2024-05-03-06
Abstract
Automated subjective assessment presents a significant challenge because human language and reasoning exhibit semantic variability, subjectivity, ambiguity, and graded levels of judgment. Unlike objective exams, subjective assessments admit diverse valid answers, which makes automated scoring difficult. This paper proposes a novel approach that integrates advanced natural language processing (NLP) techniques with principled grading methods to address this challenge. By combining Transformer-based sequence language modeling with sophisticated grading mechanisms, it aims to develop more accurate and efficient automatic grading systems for subjective assessments in education. The proposed approach consists of three main phases. (1) Content summarization: relevant sentences are extracted using self-attention mechanisms, enabling the system to summarize the content of the responses effectively. (2) Key term identification and comparison: key terms are identified within the responses and treated as overt tags; these tags are then compared to reference keys using cross-attention mechanisms, allowing a nuanced evaluation of the response content. (3) Grading: responses are graded using a weighted multi-criteria decision method, which assesses various quality aspects and assigns partial scores accordingly. Experimental results on the SQuAD dataset demonstrate the approach's effectiveness, achieving an F-score of 86%. Significant improvements in ROUGE, BLEU, and METEOR scores were also observed, validating the efficacy of the proposed approach in automating subjective assessment tasks. © 2024, Ital Publication. All rights reserved.
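The abstract's final phase combines several quality criteria into a partial score. As a minimal illustrative sketch of such a weighted multi-criteria decision step (the criterion names, weights, and function are hypothetical, not taken from the paper), each criterion yields a normalized score in [0, 1] and the grade is their weighted average scaled to the question's maximum mark:

```python
def grade_response(criterion_scores, weights, max_mark):
    """Combine per-criterion scores (each in [0, 1]) into a partial grade.

    criterion_scores and weights are dicts keyed by the same criterion names;
    the weighted average is scaled to max_mark and rounded to 2 decimals.
    """
    assert set(criterion_scores) == set(weights), "criteria must match weights"
    total_weight = sum(weights.values())
    weighted = sum(criterion_scores[c] * weights[c] for c in weights)
    return round(max_mark * weighted / total_weight, 2)

# Example with three illustrative criteria for a 5-mark question.
scores = {"key_term_coverage": 0.8, "semantic_similarity": 0.7, "completeness": 0.5}
weights = {"key_term_coverage": 0.5, "semantic_similarity": 0.3, "completeness": 0.2}
print(grade_response(scores, weights, max_mark=5))  # → 3.55
```

Because the weights are normalized by their sum, partial credit degrades smoothly as individual criteria fall short, which matches the abstract's description of assigning partial scores across quality aspects.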
Pages: 627-639
Related Papers
(50 records in total)
  • [1] Explaining transformer-based models for automatic short answer grading
    Poulton, Andrew
    Eliens, Sebas
    5TH INTERNATIONAL CONFERENCE ON DIGITAL TECHNOLOGY IN EDUCATION, ICDTE 2021, 2021, : 110 - 116
  • [2] Improving Short Answer Grading Using Transformer-Based Pre-training
    Sung, Chul
    Dhamecha, Tejas Indulal
    Mukhi, Nirmal
    ARTIFICIAL INTELLIGENCE IN EDUCATION (AIED 2019), PT I, 2019, 11625 : 469 - 481
  • [3] Reranking for Efficient Transformer-based Answer Selection
    Matsubara, Yoshitomo
    Vu, Thuy
    Moschitti, Alessandro
    PROCEEDINGS OF THE 43RD INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '20), 2020, : 1577 - 1580
  • [4] A transformer-based unified multimodal framework for Alzheimer's disease assessment
    Department of Big Data in Health Science, School of Public Health and Center of Clinical Big Data and Analytics of The Second Affiliated Hospital, Zhejiang University School of Medicine, Hangzhou, Zhejiang, China
    COMPUT. BIOL. MED.
  • [5] A novel transformer-based semantic segmentation framework for structural condition assessment
    Wang, Ruhua
    Shao, Yanda
    Li, Qilin
    Li, Ling
    Li, Jun
    Hao, Hong
    STRUCTURAL HEALTH MONITORING-AN INTERNATIONAL JOURNAL, 2024, 23 (02): : 1170 - 1183
  • [6] Empowering Short Answer Grading: Integrating Transformer-Based Embeddings and BI-LSTM Network
    Gomaa, Wael H.
    Nagib, Abdelrahman E.
    Saeed, Mostafa M.
    Algarni, Abdulmohsen
    Nabil, Emad
    BIG DATA AND COGNITIVE COMPUTING, 2023, 7 (03)
  • [7] Transformer-Based Model Predictive Control: Trajectory Optimization via Sequence Modeling
    Celestini, Davide
    Gammelli, Daniele
    Guffanti, Tommaso
    D'Amico, Simone
    Capello, Elisa
    Pavone, Marco
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (11): : 9820 - 9827
  • [8] Transformer-Based Neural Network for Answer Selection in Question Answering
    Shao, Taihua
    Guo, Yupu
    Chen, Honghui
    Hao, Zepeng
    IEEE ACCESS, 2019, 7 : 26146 - 26156
  • [9] A Transformer-Based Framework for Tiny Object Detection
    Liao, Yi-Kai
    Lin, Gong-Si
    Yeh, Mei-Chen
    2023 ASIA PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE, APSIPA ASC, 2023, : 373 - 377
  • [10] A transformer-based adversarial network framework for steganography
    Xiao, Chaoen
    Peng, Sirui
    Zhang, Lei
    Wang, Jianxin
    Ding, Ding
    Zhang, Jianyi
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 269