Aspect Based Sentiment Analysis by Pre-trained Language Representations

Cited by: 0
Authors
Liang Tianxin [1 ,2 ]
Yang Xiaoping [1 ]
Zhou Xibo [2 ]
Wang Bingqian [2 ]
Affiliations
[1] Renmin Univ China, Beijing, Peoples R China
[2] BOE Technol Grp Co Ltd, Beijing, Peoples R China
Keywords
BERT; TextCNN; Sentiment Classification; BTC
DOI
10.1109/ISPA-BDCloud-SustainCom-SocialCom48970.2019.00180
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Given a paragraph of text, the objective of aspect-level sentiment classification is to identify the sentiment polarity of a specific phrase. Most existing work employs LSTM models and attention mechanisms to predict the sentiment polarity of the targets in question. Unfortunately, these approaches do not fully exploit independent modeling of the target phrases. We propose a model based on TextCNN and a pre-trained Transformer model. In our model, representations are generated for the targets and the contexts separately. We use the Transformer model to represent a target and its context via attention learning, which improves the performance of aspect-level sentiment classification. Experiments on the COAE2014 and COAE2015 tasks show the effectiveness of our new model.
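The abstract outlines a BERT-plus-TextCNN architecture (the "BTC" of the keywords) in which the target phrase and its context are encoded separately and then combined for classification. The sketch below is a minimal, hypothetical reading of that design, assuming PyTorch and the Hugging Face transformers library; the class name BTCClassifier, the bert-base-chinese checkpoint (COAE is a Chinese-language evaluation), and all hyperparameters are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch of a BERT + TextCNN ("BTC") aspect-level classifier.
# Assumes PyTorch and Hugging Face `transformers`; names and hyperparameters
# are illustrative, not taken from the paper.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BTCClassifier(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", num_classes=3,
                 num_filters=100, kernel_sizes=(2, 3, 4)):
        super().__init__()
        # Shared pre-trained Transformer encoder.
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # TextCNN head: parallel 1-D convolutions over the token dimension.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes
        )
        # Target and context vectors are concatenated before classification.
        self.fc = nn.Linear(2 * num_filters * len(kernel_sizes), num_classes)

    def _encode(self, input_ids, attention_mask):
        # BERT output: (batch, seq, hidden) -> (batch, hidden, seq) for Conv1d.
        hidden_states = self.bert(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state.transpose(1, 2)
        # Max-pool each feature map over the sequence dimension.
        pooled = [torch.relu(conv(hidden_states)).max(dim=2).values
                  for conv in self.convs]
        return torch.cat(pooled, dim=1)

    def forward(self, ctx_ids, ctx_mask, tgt_ids, tgt_mask):
        ctx_vec = self._encode(ctx_ids, ctx_mask)  # context representation
        tgt_vec = self._encode(tgt_ids, tgt_mask)  # target-phrase representation
        return self.fc(torch.cat([ctx_vec, tgt_vec], dim=1))  # class logits

# Example usage: encode context and target separately, then classify.
tok = BertTokenizer.from_pretrained("bert-base-chinese")
ctx = tok("这款手机的屏幕非常清晰", padding="max_length", max_length=64,
          truncation=True, return_tensors="pt")
tgt = tok("屏幕", padding="max_length", max_length=8,
          truncation=True, return_tensors="pt")
model = BTCClassifier()
logits = model(ctx["input_ids"], ctx["attention_mask"],
               tgt["input_ids"], tgt["attention_mask"])
```

Padding both inputs to fixed lengths, as in the usage lines above, keeps every sequence at least as long as the widest convolution kernel, so the max-pooling step is always well defined.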
Pages: 1262-1265
Number of pages: 4
Related Papers
50 items in total
  • [21] The Biases of Pre-Trained Language Models: An Empirical Study on Prompt-Based Sentiment Analysis and Emotion Detection
    Mao, Rui
    Liu, Qian
    He, Kai
    Li, Wei
    Cambria, Erik
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (03) : 1743 - 1753
  • [22] Neural Transfer Learning For Vietnamese Sentiment Analysis Using Pre-trained Contextual Language Models
    An Pha Le
    Tran Vu Pham
    Thanh-Van Le
    Huynh, Duy V.
    2021 IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLIED NETWORK TECHNOLOGIES (ICMLANT II), 2021, : 84 - 88
  • [23] Fusion Pre-trained Emoji Feature Enhancement for Sentiment Analysis
    Chen, Jie
    Yao, Zhiqiang
    Zhao, Shu
    Zhang, Yanping
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (04)
  • [24] Sentiment Analysis Using Pre-Trained Language Model With No Fine-Tuning and Less Resource
    Kit, Yuheng
    Mokji, Musa Mohd
    IEEE ACCESS, 2022, 10 : 107056 - 107065
  • [25] Pre-Trained Language Model-Based Deep Learning for Sentiment Classification of Vietnamese Feedback
    Loc, Cu Vinh
    Viet, Truong Xuan
    Viet, Tran Hoang
    Thao, Le Hoang
    Viet, Nguyen Hoang
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2023, 22 (03)
  • [26] HORNET: Enriching Pre-trained Language Representations with Heterogeneous Knowledge Sources
    Zhang, Taolin
    Cai, Zerui
    Wang, Chengyu
    Li, Peng
    Li, Yang
    Qiu, Minghui
    Tang, Chengguang
    He, Xiaofeng
    Huang, Jun
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 2608 - 2617
  • [27] Diffused Redundancy in Pre-trained Representations
    Nanda, Vedant
    Speicher, Till
    Dickerson, John P.
    Gummadi, Krishna P.
    Feizi, Soheil
    Weller, Adrian
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [28] Pre-trained Affective Word Representations
    Chawla, Kushal
    Khosla, Sopan
    Chhaya, Niyati
    Jaidka, Kokil
    2019 8TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2019
  • [29] Multi-modal Sentiment Analysis of Mongolian Language based on Pre-trained Models and High-resolution Networks
    Yang, Yang
    Ren, Qing-Dao-Er-Ji
    He, Rui-Feng
    2024 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING, IALP 2024, 2024, : 291 - 296
  • [30] Context Analysis for Pre-trained Masked Language Models
    Lai, Yi-An
    Lalwani, Garima
    Zhang, Yi
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 3789 - 3804