Multi-task Learning Neural Networks for Comparative Elements Extraction

Cited: 1
Authors
Liu, Dianqing [1]
Wang, Lihui [1]
Shao, Yanqiu [1]
Affiliations
[1] Beijing Language & Culture Univ, Sch Informat Sci, Beijing 100083, Peoples R China
Source
Funding
Fundamental Research Funds for the Central Universities; National Natural Science Foundation of China;
Keywords
Comparative elements extraction; Neural networks; BERT-CRF; Multi-task learning; RULES;
DOI
10.1007/978-3-030-81197-6_33
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Comparative sentences are common in human languages. In online comments, a comparative sentence usually conveys a reviewer's subjective attitude or emotional tendency. Hence, comparative elements extraction (CEE) is valuable for opinion mining and sentiment analysis. Most existing CEE systems use rule-based or machine-learning approaches, which require constructing a rule base or investing substantial effort in feature engineering. These approaches usually involve multiple steps, and the performance of each step depends on the accuracy of the previous one, risking cascading errors across steps. In this paper, we adopt a neural network approach to CEE that supports end-to-end training and automatic learning of sentence representations. Furthermore, given the close relation between CEE and comparative sentence recognition (CSR), we propose a multi-task learning model that combines the two tasks and further improves CEE performance. Experimental results show that both our neural network approach and multi-task learning are effective for CEE.
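The abstract describes a multi-task architecture in which a shared sentence encoder feeds two heads: a token-level head for CEE (sequence labeling) and a sentence-level head for CSR (classification), trained jointly. The toy sketch below illustrates that structure only; the encoder, scorers, loss values, and the weighting scheme `alpha` are illustrative assumptions, not the paper's actual BERT-CRF model.

```python
def shared_encoder(tokens):
    """Stand-in for a shared encoder (e.g. BERT): one toy feature vector per token."""
    return [[float(len(t)), float(t[0].isupper())] for t in tokens]

def cee_head(features):
    """Token-level head for CEE: one tag score per token (toy linear scorer)."""
    return [f[0] * 0.5 + f[1] for f in features]

def csr_head(features):
    """Sentence-level head for CSR: comparative/non-comparative score via mean pooling."""
    return sum(f[0] for f in features) / len(features)

def multi_task_loss(loss_cee, loss_csr, alpha=0.5):
    """Joint objective: weighted sum of the two task losses."""
    return loss_cee + alpha * loss_csr

tokens = "Phone A is lighter than Phone B".split()
feats = shared_encoder(tokens)       # shared representation for both tasks
tag_scores = cee_head(feats)         # one score per token (CEE)
sent_score = csr_head(feats)         # one score for the sentence (CSR)
loss = multi_task_loss(1.2, 0.8)     # combine per-task losses for joint training
```

Because both heads read the same encoder output, gradients from the easier CSR task can shape representations that also help CEE, which is the usual motivation for combining the two tasks.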
Pages: 398-407
Page count: 10
Related Papers
50 records in total
  • [41] Mortality forecasting via multi-task neural networks
    De Mori, Luca
    Haberman, Steven
    Millossovich, Pietro
    Zhu, Rui
    ASTIN BULLETIN-THE JOURNAL OF THE INTERNATIONAL ACTUARIAL ASSOCIATION, 2025,
  • [42] Integrated Perception with Recurrent Multi-Task Neural Networks
    Bilen, Hakan
    Vedaldi, Andrea
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [44] Multi-task gradient descent for multi-task learning
    Bai, Lu
    Ong, Yew-Soon
    He, Tiantian
    Gupta, Abhishek
    MEMETIC COMPUTING, 2020, 12 (04) : 355 - 369
  • [45] Multi-task Learning for Multilingual Neural Machine Translation
    Wang, Yiren
    Zhai, ChengXiang
    Awadalla, Hany Hassan
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 1022 - 1034
  • [46] Episodic Multi-Task Learning with Heterogeneous Neural Processes
    Shen, Jiayi
    Zhen, Xiantong
    Wang, Qi
    Worring, Marcel
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [47] Multi-Task and Multi-Domain Learning with Tensor Networks
    Garg, Yash
    Prater-Bennette, Ashley
    Asif, M. Salman
    SIGNAL PROCESSING, SENSOR/INFORMATION FUSION, AND TARGET RECOGNITION XXXII, 2023, 12547
  • [48] Dynamic Multi-Task Learning with Convolutional Neural Network
    Fang, Yuchun
    Ma, Zhengyan
    Zhang, Zhaoxiang
    Zhang, Xu-Yao
    Bai, Xiang
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1668 - 1674
  • [49] Neural Multi-Task Learning for Citation Function and Provenance
    Su, Xuan
    Prasad, Animesh
    Kan, Min-Yen
    Sugiyama, Kazunari
    2019 ACM/IEEE JOINT CONFERENCE ON DIGITAL LIBRARIES (JCDL 2019), 2019, : 394 - 395
  • [50] Empirical evaluation of multi-task learning in deep neural networks for natural language processing
    Jianquan Li
    Xiaokang Liu
    Wenpeng Yin
    Min Yang
    Liqun Ma
    Yaohong Jin
    Neural Computing and Applications, 2021, 33 : 4417 - 4428