Domain Adaptation of Transformer-Based Models Using Unlabeled Data for Relevance and Polarity Classification of German Customer Feedback

Cited by: 0
Authors
Idrissi-Yaghir A. [1,3]
Schäfer H. [1,2]
Bauer N. [1]
Friedrich C.M. [1,3]
Affiliations
[1] Department of Computer Science, University of Applied Sciences and Arts Dortmund (FHDO), Emil-Figge Str. 42, Dortmund
[2] Institute for Transfusion Medicine, University Hospital Essen, Hufelandstraße 55, Essen
[3] Institute for Medical Informatics, Biometry and Epidemiology (IMIBE), University Hospital Essen, Hufelandstraße 55, Essen
Keywords
Domain adaptation; Sentiment analysis; Text classification; Transformer-based models
DOI
10.1007/s42979-022-01563-6
Abstract
Understanding customer feedback is becoming a necessity for companies seeking to identify problems and improve their products and services. Text classification and sentiment analysis can play a major role in analyzing these data using a variety of machine learning and deep learning approaches. In this work, different transformer-based models are applied to a German customer feedback dataset to explore how effective they are. In addition, these pre-trained models are further analyzed to determine whether adapting them to a specific domain using unlabeled data yields better results than off-the-shelf pre-trained models. To evaluate the models, two downstream tasks from GermEval 2017 are considered. The experimental results show that transformer-based models achieve significant improvements over a fastText baseline and outperform the published scores and previous models. For the Relevance Classification subtask, the best models achieve a micro-averaged F1-score of 96.1% on the first test set and 95.9% on the second, along with scores of 85.1% and 85.3% for the Polarity Classification subtask. © 2023, The Author(s).
Related Papers
22 items in total
  • [11] Pre-Trained Transformer-Based Models for Text Classification Using Low-Resourced Ewe Language
    Agbesi, Victor Kwaku
    Chen, Wenyu
    Yussif, Sophyani Banaamwini
    Hossin, Md Altab
    Ukwuoma, Chiagoziem C.
    Kuadey, Noble A.
    Agbesi, Colin Collinson
    Samee, Nagwan Abdel
    Jamjoom, Mona M.
    Al-antari, Mugahed A.
    SYSTEMS, 2024, 12 (01)
  • [12] Transformer-Based Water Stress Estimation Using Leaf Wilting Computed from Leaf Images and Unsupervised Domain Adaptation for Tomato Crops
    Koike, Makoto
    Onuma, Riku
    Adachi, Ryo
    Mineno, Hiroshi
    TECHNOLOGIES, 2024, 12 (07)
  • [13] Transfer Learning with Transformer-Based Models for Mine Water Inrush Prediction: A Multivariate Analysis Using Sparse and Imbalanced Monitoring Data
    Yin, Huichao
    Zhang, Gaizhuo
    Wu, Qiang
    Cui, Fangpeng
    Yan, Bicheng
    Yin, Shangxian
    Soltanian, Mohamad Reza
    Thanh, Hung Vo
    Dai, Zhenxue
    MINE WATER AND THE ENVIRONMENT, 2024
  • [14] Tesla at SemEval-2022 Task 4: Patronizing and Condescending Language Detection using Transformer-based Models with Data Augmentation
    Bhatt, Sahil Manoj
    Shrivastava, Manish
    PROCEEDINGS OF THE 16TH INTERNATIONAL WORKSHOP ON SEMANTIC EVALUATION, SEMEVAL-2022, 2022: 394-399
  • [15] High impedance fault classification in microgrids using a transformer-based model with time series harmonic synchrophasors under data quality issues
    Cieslak D.A.G.
    Moreto M.
    Lazzaretti A.E.
    Macedo-Júnior J.R.
    Neural Computing and Applications, 2024, 36 (23): 14017-14034
  • [16] Integrating structured and unstructured data for predicting emergency severity: an association and predictive study using transformer-based natural language processing models
    Zhang, Xingyu
    Wang, Yanshan
    Jiang, Yun
    Pacella, Charissa B.
    Zhang, Wenbin
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2024, 24 (01)
  • [17] Comparison of pretrained transformer-based models for influenza and COVID-19 detection using social media text data in Saskatchewan, Canada
    Tian, Yuan
    Zhang, Wenjing
    Duan, Lujie
    McDonald, Wade
    Osgood, Nathaniel
    FRONTIERS IN DIGITAL HEALTH, 2023, 5
  • [18] Continual Learning of a Transformer-Based Deep Learning Classifier Using an Initial Model from Action Observation EEG Data to Online Motor Imagery Classification
    Lee, Po-Lei
    Chen, Sheng-Hao
    Chang, Tzu-Chien
    Lee, Wei-Kung
    Hsu, Hao-Teng
    Chang, Hsiao-Huang
    BIOENGINEERING-BASEL, 2023, 10 (02)
  • [19] Full-field temperature prediction in tunnel fires using limited monitored ceiling flow temperature data with transformer-based deep learning models
    Guo, Xin
    Yang, Dong
    Jiang, Li
    Du, Tao
    Lyu, Shan
    FIRE SAFETY JOURNAL, 2024, 148
  • [20] TECHSSN1 at SemEval-2024 Task 10: Emotion Classification in Hindi-English Code-Mixed Dialogue using Transformer-based Models
    Yenumulapalli, Venkatasai Ojus
    Premnath, Pooja
    Mohankumar, Parthiban
    Sivanaiah, Rajalakshmi
    Suseelan, Angel Deborah
    PROCEEDINGS OF THE 18TH INTERNATIONAL WORKSHOP ON SEMANTIC EVALUATION, SEMEVAL-2024, 2024: 833-838