Transformer-Based Sequence Modeling Short Answer Assessment Framework

Cited by: 0
Authors
Sharmila, P. [1 ]
Anbananthen, Kalaiarasi Sonai Muthu [2 ]
Chelliah, Deisy [1 ]
Parthasarathy, S. [1 ]
Balasubramaniam, Baarathi [2 ]
Lurudusamy, Saravanan Nathan [3 ]
Affiliations
[1] Thiagarajar College of Engineering, Madurai 625015, Tamil Nadu, India
[2] Faculty of Information Science and Technology, Multimedia University, Melaka 75450, Malaysia
[3] Division Consulting & Technology Services, Telekom Malaysia, Kuala Lumpur 50672, Malaysia
Source
HighTech and Innovation Journal | 2024, Vol. 5, No. 03
Keywords
Modeling languages; Natural language processing systems
DOI
10.28991/hij-2024-05-03-06
Abstract
Automated subjective assessment is challenging because human language and reasoning involve semantic variability, subjectivity, ambiguity, and graded levels of judgment. Unlike objective exams, subjective assessments admit many valid answers, which complicates automated scoring. This paper proposes a novel approach that integrates advanced natural language processing (NLP) techniques with principled grading methods to address this challenge. By combining Transformer-based sequence language modeling with structured grading mechanisms, it aims to deliver more accurate and efficient automatic grading of subjective assessments in education. The proposed approach consists of three main phases: (1) Content summarization: relevant sentences are extracted using self-attention mechanisms, enabling the system to effectively summarize the content of each response. (2) Key term identification and comparison: key terms identified within the responses are treated as overt tags and compared to reference keys using cross-attention mechanisms, allowing a nuanced evaluation of response content. (3) Grading: responses are scored with a weighted multi-criteria decision method, which assesses various quality aspects and assigns partial credit accordingly. Experimental results on the SQuAD dataset demonstrate the approach's effectiveness, achieving an F-score of 86%, with significant improvements in ROUGE, BLEU, and METEOR scores further validating the proposed approach for automating subjective assessment tasks. © 2024, Ital Publication. All rights reserved.
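The final phase, grading by a weighted multi-criteria decision method, can be sketched as a weighted average over per-criterion scores. The criterion names and weights below are illustrative assumptions for the sketch, not values reported in the paper:

```python
# Hypothetical sketch of the weighted multi-criteria grading step:
# each response receives a score in [0, 1] per quality criterion,
# and partial credit is the weight-normalized sum of those scores.

def grade_response(criterion_scores, weights):
    """Combine per-criterion scores (each in [0, 1]) into a partial
    grade via a weighted average over the declared criteria."""
    total_weight = sum(weights.values())
    weighted = sum(weights[c] * criterion_scores.get(c, 0.0) for c in weights)
    return weighted / total_weight

# Illustrative criteria (assumed, not from the paper):
weights = {"key_term_coverage": 0.5, "content_similarity": 0.3, "coherence": 0.2}
scores = {"key_term_coverage": 0.8, "content_similarity": 0.9, "coherence": 0.6}

print(round(grade_response(scores, weights), 2))  # 0.5*0.8 + 0.3*0.9 + 0.2*0.6 = 0.79
```

A weighted average keeps the grade interpretable: changing a criterion's weight shifts its influence on partial credit without altering the [0, 1] scale of the final score.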
Pages: 627-639