Reducing tail entity hallucinations with dependency edge prediction in text-to-text transfer transformer based auto-generated questions

Cited by: 0
Authors
R. Tharaniya sairaj [1 ]
S. R. Balasundaram [2 ]
Affiliations
[1] National Institute of Technology, Department of Computer Applications
[2] National Institute of Technology, Department of Computer Applications
Keywords
Pre-trained language model; Named entities; Hallucination; Downstream transformer model; Automatic question generation
DOI
10.1007/s41870-024-02205-1
Abstract
The key to automatic question generation (AQG) is the selection of the tail entity (the named entity in the latter part of the question), which is a vital concern for generating coherent and relevant questions. To address this requirement, advanced techniques such as positional encoding and dependency parsing are widely adopted. Positional encoding captures the sequential representation of words by analyzing context and structure, while dependency parsing identifies grammatical relationships to ensure linguistic coherence. Transition-based and graph-based dependency parsing methods are widely employed to model source text sequences and generate accurate dependency relationships. However, existing dependency models struggle to capture complex semantic relationships in source texts containing subject- or domain-specific nouns. AQG has recently evolved through the application of Transformer models and end-to-end neural network architectures. To address this challenge, a hybrid model combining TreeRNN and the T5 Transformer is proposed to improve the AQG process. The model has two main objectives: (1) training TreeRNN for dependency-aware tail entity extraction and (2) fine-tuning the T5 Transformer with multi-shot tail entity prompting. Experimental results show that the proposed TreeRNN significantly improves performance metrics such as Labelled Attachment Score and Unlabelled Attachment Score. Moreover, a performance analysis of the model for domain-specific question generation highlights its effectiveness, yielding substantial numbers of auto-generated questions.
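Since the abstract reports parsing quality via the Labelled Attachment Score (LAS) and Unlabelled Attachment Score (UAS), a minimal sketch of how these standard metrics are computed may be useful. The function and the toy head/label pairs below are illustrative only and are not taken from the paper.

```python
def attachment_scores(gold, pred):
    """Compute (UAS, LAS) for a dependency parse.

    gold, pred: lists of (head_index, relation_label) pairs, one per token.
    UAS counts tokens whose predicted head matches the gold head;
    LAS additionally requires the dependency label to match.
    """
    assert len(gold) == len(pred), "parses must cover the same tokens"
    n = len(gold)
    uas_hits = sum(1 for (gh, _), (ph, _) in zip(gold, pred) if gh == ph)
    las_hits = sum(1 for g, p in zip(gold, pred) if g == p)
    return uas_hits / n, las_hits / n

# Toy 4-token sentence: one wrong head (token 3), one wrong label (token 4).
gold = [(2, "nsubj"), (0, "root"), (2, "det"), (2, "obj")]
pred = [(2, "nsubj"), (0, "root"), (4, "det"), (2, "nmod")]
uas, las = attachment_scores(gold, pred)
print(uas, las)  # 0.75 0.5
```

Note that LAS can never exceed UAS, since a labelled match presupposes a correct head attachment.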
Pages: 5407-5419
Page count: 12
Related Articles (2)
  • [1] Enhance Text-to-Text Transfer Transformer with Generated Questions for Thai Question Answering
    Phakmongkol, Puri
    Vateekul, Peerapon
    APPLIED SCIENCES-BASEL, 2021, 11 (21)
  • [2] GTAE: Graph Transformer-Based Auto-Encoders for Linguistic-Constrained Text Style Transfer
    Shi, Yukai
    Zhang, Sen
    Zhou, Chenxing
    Liang, Xiaodan
    Yang, Xiaojun
    Lin, Liang
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2021, 12 (03)