Power text information extraction based on multi-task learning

Cited by: 0
Authors
Ji, Xin [1 ,2 ]
Wu, Tongxin [1 ]
Yu, Ting [1 ]
Dong, Linxiao [1 ]
Chen, Yiting [1 ]
Mi, Na [1 ]
Zhao, Jiakui [1 ]
Affiliations
[1] Big Data Center of State Grid Corporation of China, Beijing 100031, China
[2] School of Computer Science and Engineering, Beihang University, Beijing 100191, China
Keywords
Power distribution faults
DOI
10.13700/j.bh.1001-5965.2022.0683
Abstract
In order to improve the speed of analyzing and processing power system fault text in real business scenarios, a power fault text information extraction model based on pre-training and multi-task learning was proposed. A pre-trained model was used to learn the contextual information of words in power text, and first-order and second-order fusion features of the words were mined to enhance the representational capacity of the features. A multi-task learning framework was used to combine named entity recognition and relation extraction, so that the two tasks supplement and reinforce each other and the overall performance of power fault text information extraction is improved. The model was validated on daily business data from a power data center. Compared with other models, the proposed model achieved higher accuracy and recall on both entity recognition and relation extraction for power fault text. © 2024 Beijing University of Aeronautics and Astronautics (BUAA). All rights reserved.
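For readers who want a concrete picture of the multi-task setup the abstract describes, the PyTorch sketch below wires a shared pre-trained encoder to two task heads, one for named entity recognition and one for relation extraction, and trains them with a joint loss. It is a minimal illustration under stated assumptions, not the authors' implementation: the class and parameter names (JointNerReModel, re_weight, head_index, tail_index), the bert-base-chinese checkpoint, and the span-pair relation head are hypothetical stand-ins, and the paper's first- and second-order feature fusion is not reproduced here.

```python
# Minimal sketch of a shared-encoder multi-task model for joint NER and
# relation extraction. Illustrative assumptions only; not the paper's model.
import torch
import torch.nn as nn
from transformers import BertModel

class JointNerReModel(nn.Module):  # hypothetical name
    def __init__(self, num_entity_labels, num_relation_labels,
                 encoder_name="bert-base-chinese", re_weight=1.0):
        super().__init__()
        # Shared pre-trained encoder: both task heads consume its contextual
        # word features, so gradients from each task update the same encoder.
        self.encoder = BertModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Head 1: per-token entity tagging (e.g. BIO labels).
        self.ner_head = nn.Linear(hidden, num_entity_labels)
        # Head 2: relation classification over an entity pair, represented
        # here by the concatenated hidden states at the two span heads.
        self.re_head = nn.Linear(2 * hidden, num_relation_labels)
        self.re_weight = re_weight
        # Padded token positions should carry label -100 so they are ignored.
        self.loss_fn = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, input_ids, attention_mask, head_index, tail_index,
                ner_labels=None, re_labels=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        hidden = out.last_hidden_state                      # (B, T, H)
        ner_logits = self.ner_head(hidden)                  # (B, T, E)
        batch = torch.arange(input_ids.size(0), device=input_ids.device)
        pair = torch.cat([hidden[batch, head_index],
                          hidden[batch, tail_index]], dim=-1)   # (B, 2H)
        re_logits = self.re_head(pair)                      # (B, R)
        loss = None
        if ner_labels is not None and re_labels is not None:
            # Joint objective: summing the two task losses is what lets the
            # entity and relation signals supplement each other.
            loss = (self.loss_fn(ner_logits.flatten(0, 1), ner_labels.flatten())
                    + self.re_weight * self.loss_fn(re_logits, re_labels))
        return loss, ner_logits, re_logits
```

Because the encoder is shared, the NER loss regularizes the features the relation head sees and vice versa, which is the mutual-promotion effect the abstract claims.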
Pages: 2461 - 2469
Related Papers
50 records in total
  • [31] Ask the GRU: Multi-task Learning for Deep Text Recommendations
    Bansal, Trapit
    Belanger, David
    McCallum, Andrew
    PROCEEDINGS OF THE 10TH ACM CONFERENCE ON RECOMMENDER SYSTEMS (RECSYS'16), 2016, : 107 - 114
  • [32] Multi-task Learning with Bidirectional Language Models for Text Classification
    Yang, Qi
    Shang, Lin
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [33] Multi-task learning for historical text normalization: Size matters
    Bollmann, Marcel
    Søgaard, Anders
    Bingel, Joachim
    DEEP LEARNING APPROACHES FOR LOW-RESOURCE NATURAL LANGUAGE PROCESSING (DEEPLO), 2018, : 19 - 24
  • [34] Multi-task learning using a hybrid representation for text classification
    Lu, Guangquan
    Gan, Jiangzhang
    Yin, Jian
    Luo, Zhiping
    Li, Bo
    Zhao, Xishun
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (11): 6467 - 6480
  • [35] CoTexT: Multi-task Learning with Code-Text Transformer
    Phan, Long
    Tran, Hieu
    Le, Daniel
    Nguyen, Hieu
    Anibal, James
    Peltekian, Alec
    Ye, Yanfang
    NLP4PROG 2021: THE 1ST WORKSHOP ON NATURAL LANGUAGE PROCESSING FOR PROGRAMMING (NLP4PROG 2021), 2021, : 40 - 47
  • [37] Fact Aware Multi-task Learning for Text Coherence Modeling
    Abhishek, Tushar
    Rawat, Daksh
    Gupta, Manish
    Varma, Vasudeva
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2022, PT II, 2022, 13281 : 340 - 353
  • [38] Multi-Task Learning for Text-dependent Speaker Verification
    Chen, Nanxin
    Qian, Yanmin
    Yu, Kai
    16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015, : 185 - 189
  • [39] An Information-Theoretic Approach for Multi-task Learning
    Yang, Pei
    Tan, Qi
    Xu, Hao
    Ding, Yehua
    ADVANCED DATA MINING AND APPLICATIONS, PROCEEDINGS, 2009, 5678 : 386 - 396
  • [40] Multi-node load forecasting based on multi-task learning with modal feature extraction
    Tan, Mao
    Hu, Chenglin
    Chen, Jie
    Wang, Ling
    Li, Zhengmao
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2022, 112