Aspect Based Sentiment Analysis by Pre-trained Language Representations

Cited: 0
Authors
Liang Tianxin [1 ,2 ]
Yang Xiaoping [1 ]
Zhou Xibo [2 ]
Wang Bingqian [2 ]
Affiliations
[1] Renmin Univ China, Beijing, Peoples R China
[2] BOE Technol Grp Co Ltd, Beijing, Peoples R China
Keywords
BERT; TextCNN; Sentiment Classification; BTC;
DOI
10.1109/ISPA-BDCloud-SustainCom-SocialCom48970.2019.00180
CLC number
TP3 [computing technology, computer technology];
Discipline code
0812;
Abstract
Given a paragraph of text, the objective of aspect-level sentiment classification is to identify the sentiment polarity of a specific phrase. Most existing work employs LSTM models and attention mechanisms to predict the sentiment polarity of the target phrases. Unfortunately, these approaches do not fully exploit independent modeling of the target phrases. We propose a model based on TextCNN and a pre-trained Transformer model. In our model, representations are generated for the targets and the contexts separately. We use a Transformer model to represent a target and its context via attention learning, which improves the performance of aspect-level sentiment classification. Experiments on the COAE2014 and COAE2015 tasks show the effectiveness of our new model.
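The core idea in the abstract, encoding the target phrase and its context separately and pooling each with a TextCNN before classification, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: in the paper the token vectors would come from a pre-trained Transformer (BERT), whereas here random embeddings stand in, and all layer names, filter counts, and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB = 32          # embedding size (stand-in for contextual BERT vectors)
N_FILTERS = 16    # convolution filters per branch
N_CLASSES = 3     # e.g. negative / neutral / positive
WIDTH = 3         # convolution window (tokens per window)

def textcnn_pool(tokens, filters, width=WIDTH):
    """TextCNN branch: 1-D convolution over token vectors, then max-over-time pooling."""
    n_tokens = tokens.shape[0]
    feats = []
    for t in range(n_tokens - width + 1):
        window = tokens[t:t + width].reshape(-1)     # flatten window -> (width*EMB,)
        feats.append(filters @ window)               # filter responses -> (N_FILTERS,)
    return np.max(np.stack(feats), axis=0)           # max pooling -> (N_FILTERS,)

def classify(context_vecs, target_vecs, w_ctx, w_tgt, w_out):
    """Pool context and target independently, concatenate, and apply a softmax classifier."""
    h = np.concatenate([textcnn_pool(context_vecs, w_ctx),
                        textcnn_pool(target_vecs, w_tgt)])
    logits = w_out @ h
    e = np.exp(logits - logits.max())                # numerically stable softmax
    return e / e.sum()                               # polarity distribution

# Toy example: 10-token context, 4-token target, random untrained weights.
context = rng.normal(size=(10, EMB))
target = rng.normal(size=(4, EMB))
w_ctx = rng.normal(size=(N_FILTERS, WIDTH * EMB)) * 0.1
w_tgt = rng.normal(size=(N_FILTERS, WIDTH * EMB)) * 0.1
w_out = rng.normal(size=(N_CLASSES, 2 * N_FILTERS)) * 0.1
probs = classify(context, target, w_ctx, w_tgt, w_out)
```

Keeping two separate convolution branches is what gives the target phrase its own independent representation, rather than letting it be absorbed into a single sequence encoding.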
Pages: 1262-1265
Page count: 4