Multi-task Learning Neural Networks for Comparative Elements Extraction

Cited by: 1
Authors
Liu, Dianqing [1 ]
Wang, Lihui [1 ]
Shao, Yanqiu [1 ]
Affiliations
[1] Beijing Language & Culture Univ, Sch Informat Sci, Beijing 100083, Peoples R China
Source
Funding
Fundamental Research Funds for the Central Universities; National Natural Science Foundation of China;
Keywords
Comparative elements extraction; Neural networks; BERT-CRF; Multi-task learning; RULES;
DOI
10.1007/978-3-030-81197-6_33
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Comparative sentences are common in human languages. In online comments, a comparative sentence usually conveys the subjective attitude or emotional tendency of a reviewer. Hence, comparative elements extraction (CEE) is valuable for opinion mining and sentiment analysis. Most existing CEE systems use rule-based or machine learning approaches that require constructing a rule base or investing substantial effort in feature engineering. These approaches usually involve multiple steps, and the performance of each step depends on the accuracy of the previous one, risking cascading errors across steps. In this paper, we adopt a neural network approach to CEE that supports end-to-end training and automatic learning of sentence representations. Furthermore, given the high relevance of CEE to comparative sentence recognition (CSR), we propose a multi-task learning model that combines the two tasks and further improves the performance of CEE. Experimental results show that both our neural network approach and multi-task learning are effective for CEE.
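The multi-task setup described in the abstract can be sketched as a shared sentence encoding feeding two heads: a token-level head for comparative elements extraction (CEE, sequence labeling) and a sentence-level head for comparative sentence recognition (CSR, binary classification), trained on a weighted joint loss. The sketch below is illustrative only: the tag set, dimensions, and loss weight `csr_weight` are assumptions, not values from the paper, and the paper's CRF decoding layer is replaced here by a per-token softmax for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed BIO tag set for comparative elements; not taken from the paper.
TAGS = ["O", "B-SUBJ", "I-SUBJ", "B-OBJ", "I-OBJ", "B-RES", "I-RES"]
HIDDEN = 16   # illustrative encoder width
SEQ_LEN = 8   # illustrative sentence length

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Shared encoder output (stands in for BERT token representations).
shared = rng.normal(size=(SEQ_LEN, HIDDEN))

# Task-specific heads on top of the shared representation.
W_cee = rng.normal(size=(HIDDEN, len(TAGS)))   # token-level tag classifier (CEE)
W_csr = rng.normal(size=(HIDDEN,))             # sentence-level classifier (CSR)

tag_probs = softmax(shared @ W_cee)            # (SEQ_LEN, |TAGS|)
csr_prob = 1.0 / (1.0 + np.exp(-(shared.mean(axis=0) @ W_csr)))

# Toy gold labels for one sentence.
gold_tags = rng.integers(0, len(TAGS), size=SEQ_LEN)
gold_csr = 1  # sentence is comparative

# Joint loss: CEE token cross-entropy plus weighted CSR log loss.
cee_loss = -np.log(tag_probs[np.arange(SEQ_LEN), gold_tags]).mean()
csr_weight = 0.5  # assumed task-weighting hyperparameter
csr_loss = -(gold_csr * np.log(csr_prob)
             + (1 - gold_csr) * np.log(1.0 - csr_prob))
joint_loss = cee_loss + csr_weight * csr_loss
print(float(joint_loss))
```

Sharing the encoder lets the CSR signal (is this sentence comparative at all?) regularize the CEE head, which is the relevance between the two tasks that the abstract argues multi-task learning exploits.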
Pages: 398-407
Page count: 10
Related papers
50 records in total
  • [31] Adversarial Multi-task Learning of Deep Neural Networks for Robust Speech Recognition
    Shinohara, Yusuke
    17TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2016), VOLS 1-5: UNDERSTANDING SPEECH PROCESSING IN HUMANS AND MACHINES, 2016, : 2369 - 2372
  • [32] Multi-Task Federated Learning for Personalised Deep Neural Networks in Edge Computing
    Mills, Jed
    Hu, Jia
    Min, Geyong
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (03) : 630 - 641
  • [33] Multi-Task Learning for Food Identification and Analysis with Deep Convolutional Neural Networks
    Zhang, Xi-Jin
    Lu, Yi-Fan
    Zhang, Song-Hai
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2016, 31 (03) : 489 - 500
  • [35] Learning rates for multi-task regularization networks
    Gui, Jie
    Zhang, Haizhang
    NEUROCOMPUTING, 2021, 466 : 243 - 251
  • [36] Multi-Task Networks With Universe, Group, and Task Feature Learning
    Pentyala, Shiva
    Liu, Mengwen
    Dreyer, Markus
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 820 - 830
  • [37] Optical multi-task learning using multi-wavelength diffractive deep neural networks
    Duan, Zhengyang
    Chen, Hang
    Lin, Xing
    NANOPHOTONICS, 2023, 12 (05) : 893 - 903
  • [38] A Multi-task Learning Framework for Opinion Triplet Extraction
    Zhang, Chen
    Li, Qiuchi
    Song, Dawei
    Wang, Benyou
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 819 - 828
  • [39] Multi-task neural networks by learned contextual inputs
    Sandnes, Anders T.
    Grimstad, Bjarne
    Kolbjornsen, Odd
    NEURAL NETWORKS, 2024, 179
  • [40] Multi-task neural networks for dealing with missing inputs
    Garcia-Laencina, Pedro J.
    Serrano, Jesus
    Figueiras-Vidal, Anibal R.
    Sancho-Gomez, Jose-Luis
    BIO-INSPIRED MODELING OF COGNITIVE TASKS, PT 1, PROCEEDINGS, 2007, 4527 : 282 - +