Detecting Sarcasm in Conversation Context Using Transformer-Based Models

Cited: 0
Authors
Avvaru, Adithya [1 ,2 ]
Vobilisetty, Sanath [2 ]
Mamidi, Radhika [1 ]
Institutions
[1] Int Inst Informat Technol, Hyderabad, India
[2] Teradata India Pvt Ltd, Mumbai, Maharashtra, India
Keywords
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Sarcasm detection, regarded as a subproblem of sentiment analysis, is a particularly tricky task because the introduction of sarcastic words can flip the sentiment of the sentence itself. To date, most research has focused on detecting sarcasm in a single sentence, and there has been very limited work on detecting sarcasm that arises across multiple sentences. Current models use Long Short-Term Memory (Hochreiter and Schmidhuber, 1997) (LSTM) variants, with or without attention, to detect sarcasm in conversations. We show that models using state-of-the-art Bidirectional Encoder Representations from Transformers (Devlin et al., 2018) (BERT), which capture syntactic and semantic information across conversation sentences, perform better than current models. Based on our data analysis, we estimated the number of sentences in the conversation that can contribute to the sarcasm, and the results agree with this estimate. We also performed a comparative study of different versions of our BERT-based model against variants of the LSTM model and XLNet (Yang et al., 2019) (both using the estimated number of conversation sentences) and found that the BERT-based models outperformed them.
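To illustrate the kind of input construction the abstract describes — pairing a limited number of recent conversation sentences with the response for a BERT-style sentence-pair classifier — here is a minimal, hypothetical sketch. The helper name, the default `k`, and the example turns are all illustrative assumptions, not the authors' code; real use would pass the raw sentence pair to a tokenizer rather than splicing special tokens by hand.

```python
# Hypothetical sketch: format conversation context + response as a single
# BERT-style sentence pair. Not the authors' implementation.

def build_bert_input(context, response, k=3, sep="[SEP]", cls="[CLS]"):
    """Keep only the last k context sentences (the abstract suggests a
    limited number of conversation sentences contribute to the sarcasm),
    then pair that context segment with the response."""
    recent = context[-k:]                      # most recent k turns only
    context_segment = " ".join(recent)
    # [CLS] context ... [SEP] response [SEP] -- standard pair encoding
    return f"{cls} {context_segment} {sep} {response} {sep}"

example = build_bert_input(
    ["Nice weather today.", "My flight got cancelled.", "Stuck at the airport."],
    "Oh great, just what I needed!",
    k=2,  # only the two most recent context sentences are kept
)
```

With `k=2`, the oldest turn is dropped and only the two most recent context sentences reach the classifier, mirroring the paper's idea of estimating how many conversation sentences matter.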
Pages: 98-103 (6 pages)
Related papers (50 total)
  • [21] On Robustness of Finetuned Transformer-based NLP Models
    Neerudu, Pavan Kalyan Reddy
    Oota, Subba Reddy
    Marreddy, Mounika
    Kagita, Venkateswara Rao
    Gupta, Manish
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023, : 7180 - 7195
  • [22] TransInpaint: Transformer-based Image Inpainting with Context Adaptation
    Shamsolmoali, Pourya
    Zareapoor, Masoumeh
    Granger, Eric
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023, : 849 - 858
  • [23] Influence of Context in Transformer-Based Medication Relation Extraction
    Modersohn, Luise
    Hahn, Udo
    MEDINFO 2023 - THE FUTURE IS ACCESSIBLE, 2024, 310 : 669 - 673
  • [24] Adaptation of Transformer-Based Models for Depression Detection
    Adebanji, Olaronke O.
    Ojo, Olumide E.
    Calvo, Hiram
    Gelbukh, Irina
    Sidorov, Grigori
    COMPUTACION Y SISTEMAS, 2024, 28 (01): : 151 - 165
  • [25] Transformer-based Multi-Party Conversation Generation using Dialogue Discourse Acts Planning
    Chernyavskiy, Alexander
    Ilvovsky, Dmitry
    24TH MEETING OF THE SPECIAL INTEREST GROUP ON DISCOURSE AND DIALOGUE, SIGDIAL 2023, 2023, : 519 - 529
  • [26] The Role of Conversation Context for Sarcasm Detection in Online Interactions
    Ghosh, Debanjan
    Fabbri, Alexander Richard
    Muresan, Smaranda
    18TH ANNUAL MEETING OF THE SPECIAL INTEREST GROUP ON DISCOURSE AND DIALOGUE (SIGDIAL 2017), 2017, : 186 - 196
  • [27] Augmenting Data for Sarcasm Detection with Unlabeled Conversation Context
    Lee, Hankyol
    Yu, Youngjae
    Kim, Gunhee
    FIGURATIVE LANGUAGE PROCESSING, 2020, : 12 - 17
  • [28] Transformer-Based Deep Learning for Sarcasm Detection with Imbalanced Dataset: Resampling Techniques with Downsampling and Augmentation
    Abdullah, Malak
    Khrais, Jumana
    Swedat, Safa
    2022 13TH INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION SYSTEMS (ICICS), 2022, : 294 - 300
  • [29] Compressing Transformer-Based Semantic Parsing Models using Compositional Code Embeddings
    Prakash, Prafull
    Shashidhar, Saurabh Kumar
    Zhao, Wenlong
    Rongali, Subendhu
    Khan, Haidar
    Kayser, Michael
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 4711 - 4717
  • [30] Object detection using convolutional neural networks and transformer-based models: a review
    Shah, Shrishti
    Tembhurne, Jitendra
    JOURNAL OF ELECTRICAL SYSTEMS AND INFORMATION TECHNOLOGY, 10 (1)