GeDa: Improving training data with large language models for Aspect Sentiment Triplet Extraction

Cited: 3
Authors
Mai, Weixing [1 ]
Zhang, Zhengxuan [1 ]
Chen, Yifan [1 ]
Li, Kuntao [1 ]
Xue, Yun [1 ]
Affiliations
[1] South China Normal Univ, Sch Elect & Informat Engn, Guangdong Prov Key Lab Quantum Engn & Quantum Mat, Foshan 528225, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Aspect sentiment triplet extraction; Improving training data; Large language models; Targeted data selection;
DOI
10.1016/j.knosys.2024.112289
CLC Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Aspect Sentiment Triplet Extraction (ASTE) is a subtask of Aspect-based Sentiment Analysis (ABSA). Recently, ASTE methods have achieved promising results. However, the performance of ASTE models is restricted by both the quantity and the quality of training data. As such, challenges lie in collecting valuable data and selecting targeted data for diversified ASTE model architectures. To this end, we propose a novel General Data-Centric Framework (GeDa), which is capable of improving the training data for ASTE models accurately and efficiently. Specifically, two types of prompts are designed to guide large language models in synthesizing candidate data for the ASTE task. Then, the Characteristic-Driven Iterative Strategy is put forward to optimize the interaction between the model and the training data: data is iteratively selected from the synthetic candidates to improve both the quantity and the quality of the training set. With multiple iterations, a targeted training set can be obtained to benefit ASTE model learning. Extensive experiments reveal that ASTE models with GeDa gain more than 5% in average F1 by adding only a small amount of training data.
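As a rough illustration of the data-centric loop sketched in the abstract, the Python snippet below shows how LLM-synthesized candidates could be iteratively scored and merged into an ASTE training set. This is a minimal sketch under stated assumptions: the helper names (synthesize_candidates, train_and_score), the random scoring, and the prompt handling are placeholders for illustration, not the paper's prompts or its Characteristic-Driven Iterative Strategy.

```python
# Hypothetical sketch of a data-centric selection loop, not the authors' implementation.
import random
from dataclasses import dataclass


@dataclass
class Triplet:
    aspect: str
    opinion: str
    sentiment: str  # "POS", "NEU", or "NEG"


@dataclass
class Example:
    sentence: str
    triplets: list


def synthesize_candidates(seed_examples, n_candidates=20):
    """Stand-in for prompting a large language model.

    A real pipeline would send a structured prompt with a few seed sentences and
    their triplets and parse new labelled sentences from the LLM's reply. Here we
    only perturb the seeds so the sketch runs without any API access."""
    candidates = []
    for _ in range(n_candidates):
        seed = random.choice(seed_examples)
        candidates.append(Example(sentence=seed.sentence + " (paraphrased)",
                                  triplets=list(seed.triplets)))
    return candidates


def train_and_score(train_set, candidate):
    """Stand-in for training and probing the target ASTE model.

    A real implementation would fine-tune the ASTE model on train_set and score the
    candidate by some characteristic (e.g. model uncertainty or validation F1 gain).
    The random score here only keeps the loop executable."""
    return random.random()


def iterative_selection(train_set, seed_examples, rounds=3, keep_per_round=5):
    """Iteratively grow the training set from synthetic candidates."""
    for _ in range(rounds):
        candidates = synthesize_candidates(seed_examples)
        scored = sorted(candidates,
                        key=lambda c: train_and_score(train_set, c),
                        reverse=True)
        train_set.extend(scored[:keep_per_round])  # keep only the most useful data
    return train_set


if __name__ == "__main__":
    seeds = [Example("The pizza was great but the service was slow.",
                     [Triplet("pizza", "great", "POS"),
                      Triplet("service", "slow", "NEG")])]
    augmented = iterative_selection(list(seeds), seeds)
    print(f"Training set grew to {len(augmented)} examples.")
```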
Pages: 9
Related Papers
50 records in total
  • [31] DTS: A Decoupled Task Specificity Approach for Aspect Sentiment Triplet Extraction
    Wang, Bao
    Wang, Guangjin
    Liu, Peiyu
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 273
  • [32] Aspect Sentiment Triplet Extraction Based on Deep Relationship Enhancement Networks
    Peng, Jun
    Su, Baohua
    APPLIED SCIENCES-BASEL, 2024, 14 (05)
  • [33] A dual relation-encoder network for aspect sentiment triplet extraction
    Xia, Tian
    Sun, Xia
    Yang, Yidong
    Long, Yunfei
    Sutcliffe, Richard
    NEUROCOMPUTING, 2024, 597
  • [34] Quantification of part-of-speech relationships for aspect sentiment triplet extraction
    Wang, Jiacan
    Liu, Jianhua
    Ke, Tianci
    Chen, Kewei
    Cai, Zijie
    Xu, Ge
    JOURNAL OF INTELLIGENT INFORMATION SYSTEMS, 2025
  • [35] Improving span-based Aspect Sentiment Triplet Extraction with part-of-speech filtering and contrastive learning
    Li, Qingling
    Wen, Wushao
    Qin, Jinghui
    NEURAL NETWORKS, 2024, 177
  • [36] Aspect-Based Sentiment Analysis of Patient Feedback Using Large Language Models
    Alkhnbashi, Omer S.
    Mohammad, Rasheed
    Hammoudeh, Mohammad
    BIG DATA AND COGNITIVE COMPUTING, 2024, 8 (12)
  • [37] Exploring large language models for the generation of synthetic training samples for aspect-based sentiment analysis in low resource settings
    Hellwig, Nils Constantin
    Fehle, Jakob
    Wolff, Christian
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 261
  • [38] Measuring Your ASTE Models in The Wild: A Diversified Multi-domain Dataset For Aspect Sentiment Triplet Extraction
    Xu, Ting
    Yang, Huiyun
    Wu, Zhen
    Chen, Jiaze
    Zhao, Fei
    Dai, Xinyu
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023: 2837 - 2853
  • [39] Sentiment trading with large language models
    Kirtac, Kemal
    Germano, Guido
    FINANCE RESEARCH LETTERS, 2024, 62
  • [40] Leveraging hierarchical language models for aspect-based sentiment analysis on financial data
    Lengkeek, Matteo
    Knaap, Finn van der
    Frasincar, Flavius
    INFORMATION PROCESSING & MANAGEMENT, 2023, 60 (05)