GeDa: Improving training data with large language models for Aspect Sentiment Triplet Extraction

Cited by: 3
Authors
Mai, Weixing [1 ]
Zhang, Zhengxuan [1 ]
Chen, Yifan [1 ]
Li, Kuntao [1 ]
Xue, Yun [1 ]
Affiliations
[1] South China Normal Univ, Sch Elect & Informat Engn, Guangdong Prov Key Lab Quantum Engn & Quantum Mat, Foshan 528225, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Aspect sentiment triplet extraction; Improving training data; Large language models; Targeted data selection;
DOI
10.1016/j.knosys.2024.112289
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Aspect Sentiment Triplet Extraction (ASTE) is a subtask of Aspect-based Sentiment Analysis (ABSA). Recently, ASTE methods have achieved promising results. However, the performance of ASTE models is restricted by both the quantity and the quality of training data. As such, challenges lie in collecting valuable data and selecting targeted data for diverse ASTE model architectures. To this end, we propose a novel General Data-Centric Framework (GeDa), which is capable of improving the training data for ASTE models accurately and efficiently. Specifically, two types of prompts are designed to guide large language models in synthesizing candidate data for the ASTE task. Then, the Characteristic-Driven Iterative Strategy is put forward to optimize the interaction between the model and the training data. Data is iteratively selected from the synthetic candidates to improve both the quantity and the quality of the training set. After multiple iterations, a targeted training set is obtained that benefits ASTE model learning. Extensive experiments reveal that ASTE models with GeDa achieve more than a 5% increase in average F1 by adding only a small amount of training data.
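The iterate-select-retrain loop described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the function names, the score-and-rank selection criterion, and all parameters are assumptions made for exposition, not the paper's actual algorithm.

```python
def iterative_selection(train_set, candidates, train_fn, score_fn,
                        rounds=3, k=50):
    """Hedged sketch of a characteristic-driven iterative strategy:
    repeatedly retrain an ASTE model, rank the LLM-synthesized
    candidates by some characteristic score, and fold the top-k
    candidates into the training set."""
    for _ in range(rounds):
        model = train_fn(train_set)          # retrain on current data
        ranked = sorted(candidates,
                        key=lambda c: score_fn(model, c),
                        reverse=True)        # rank by characteristic score
        selected, candidates = ranked[:k], ranked[k:]
        train_set = train_set + selected     # grow the targeted training set
    return train_set
```

With real components, `train_fn` would fit the chosen ASTE architecture and `score_fn` would measure how informative a synthetic candidate is to the current model; here both are left abstract so the loop itself stays architecture-agnostic, matching the framework's stated goal of supporting diverse ASTE models.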
Pages: 9