Cross-domain sentiment classification using decoding-enhanced bidirectional encoder representations from transformers with disentangled attention

Cited by: 3
Authors
Singh, Rahul Kumar [1 ,2 ,4 ]
Sachan, Manoj Kumar [1 ]
Patel, Ram Bahadur [3 ]
Affiliations
[1] St Longowal Inst Engn & Technol, Dept Comp Sci & Engn, Sangrur, India
[2] Univ Petr & Energy Studies, Sch Comp Sci, Dehra Dun, India
[3] Chandigarh Coll Engn & Technol, Dept Comp Sci & Engn Degree Wing, Chandigarh, India
[4] Univ Petr & Energy Studies, Sch Comp Sci, Dehra Dun 248007, Uttarakhand, India
Source
CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE | 2023, Vol. 35, No. 6
Keywords
cross-domain sentiment classification; DeBERTa; domain adaptation; opinion mining; sentiment analysis;
DOI
10.1002/cpe.7589
CLC Classification
TP31 [Computer Software]
Subject Classification Codes
081202; 0835
Abstract
Cross-domain sentiment classification is a significant task within sentiment analysis that aims to predict the opinion orientation of text documents in a target domain using a classifier learned on a source domain. Most existing domain-adaptation approaches to sentiment classification focus on sharing low-dimensional features across domains, using domain-independent and domain-specific features to mitigate the gap between domains. Earlier cross-domain sentiment classification approaches mainly operated at the document and sentence levels, so they could not capture the full impact of aspect words, word positions, and long-term dependencies. To address this concern, we propose a model for cross-domain sentiment classification based on decoding-enhanced BERT with disentangled attention (DeBERTa), a pretrained language model built on the transformer architecture. In this article, we perform sentence and aspect embedding to mine wordpiece information from text documents. The DeBERTa language model uses a disentangled attention mechanism and an enhanced mask decoder to capture expressive features. The disentangled attention mechanism encodes each word as two vectors, a content vector and a position vector. To predict the masked tokens during model pretraining, an enhanced mask decoder is employed that incorporates absolute positions in the decoding layer. Finally, experiments conducted on a benchmark dataset demonstrate the superiority of the fine-tuned DeBERTa model for cross-domain sentiment classification tasks.
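For readers unfamiliar with the mechanism, the following sketch illustrates DeBERTa-style disentangled attention for a single head: each token carries a content vector and a relative-position vector, and the attention score is the sum of content-to-content, content-to-position, and position-to-content terms, scaled by sqrt(3d). The toy sizes, random projection matrices, and variable names are illustrative assumptions for this sketch, not the authors' actual configuration.

```python
# Minimal sketch of DeBERTa-style disentangled attention (single head,
# no batching), assuming PyTorch. Toy dimensions and random projections
# are illustrative only.
import torch

torch.manual_seed(0)
seq_len, d, max_rel = 6, 8, 4          # toy sequence length, model dim, max relative distance

H = torch.randn(seq_len, d)            # content vector per token
P = torch.randn(2 * max_rel, d)        # shared relative-position embeddings

Wq_c, Wk_c = torch.randn(d, d), torch.randn(d, d)   # content projections
Wq_r, Wk_r = torch.randn(d, d), torch.randn(d, d)   # position projections
Wv = torch.randn(d, d)                               # value projection

Qc, Kc = H @ Wq_c, H @ Wk_c            # content queries and keys
Qr, Kr = P @ Wq_r, P @ Wk_r            # position queries and keys

# delta(i, j): relative distance i - j, clipped and shifted into [0, 2*max_rel - 1]
idx = torch.arange(seq_len)
delta = (idx[:, None] - idx[None, :]).clamp(-max_rel, max_rel - 1) + max_rel

c2c = Qc @ Kc.T                                 # content-to-content scores
c2p = torch.gather(Qc @ Kr.T, 1, delta)         # content-to-position scores
p2c = torch.gather(Kc @ Qr.T, 1, delta).T       # position-to-content scores

scores = (c2c + c2p + p2c) / (3 * d) ** 0.5     # scale by sqrt(3d)
attn = torch.softmax(scores, dim=-1)
out = attn @ (H @ Wv)                           # attention-weighted values
print(out.shape)                                # torch.Size([6, 8])
```

In the full model this three-term score is computed per head across a deep transformer stack, and the enhanced mask decoder injects absolute-position information near the output layer, so masked-token prediction can distinguish tokens that share the same relative context.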
Pages: 18
Related Papers
50 in total
  • [1] Climate Change Sentiment Analysis Using Domain Specific Bidirectional Encoder Representations From Transformers
    Anoop, V. S.
    Krishnan, T. K. Ajay
    Daud, Ali
    Banjar, Ameen
    Bukhari, Amal
    IEEE ACCESS, 2024, 12 : 114912 - 114922
  • [2] Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model
    Areshey, Ali
    Mathkour, Hassan
    SENSORS, 2023, 23 (11)
  • [3] Multimodal Abstractive Summarization using bidirectional encoder representations from transformers with attention mechanism
    Argade, Dakshata
    Khairnar, Vaishali
    Vora, Deepali
    Patil, Shruti
    Kotecha, Ketan
    Alfarhood, Sultan
    HELIYON, 2024, 10 (04)
  • [4] Sentiment Analysis of Turkish Drug Reviews with Bidirectional Encoder Representations from Transformers
    Bozuyla, Mehmet
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2024, 23 (01)
  • [5] Simultaneous Learning of Pivots and Representations for Cross-Domain Sentiment Classification
    Li, Liang
    Ye, Weirui
    Long, Mingsheng
    Tang, Yateng
    Xu, Jin
    Wang, Jianmin
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 8220 - 8227
  • [6] Multi-Domain Aspect Extraction Using Bidirectional Encoder Representations From Transformers
    dos Santos, Brucce Neves
    Marcacini, Ricardo Marcondes
    Rezende, Solange Oliveira
    IEEE ACCESS, 2021, 9 : 91604 - 91613
  • [7] Protein Sequence Classification Using Bidirectional Encoder Representations from Transformers (BERT) Approach
    Balamurugan, R.
    Mohite, S.
    Raja, S. P.
    SN COMPUTER SCIENCE, 4 (5)
  • [8] Hierarchical Attention Transfer Network for Cross-Domain Sentiment Classification
    Li, Zheng
    Wei, Ying
    Zhang, Yu
    Yang, Qiang
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 5852 - 5859
  • [9] Cross-Domain Sentiment Classification with Attention-Assisted GAN
    Li, Yi-Fan
    Lin, Yu
    Gao, Yang
    Khan, Latifur
    2021 IEEE THIRD INTERNATIONAL CONFERENCE ON COGNITIVE MACHINE INTELLIGENCE (COGMI 2021), 2021, : 88 - 95
  • [10] Interactive Attention Transfer Network for Cross-Domain Sentiment Classification
    Zhang, Kai
    Zhang, Hefu
    Liu, Qi
    Zhao, Hongke
    Zhu, Hengshu
    Chen, Enhong
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 5773 - 5780