Assisted Process Knowledge Graph Building Using Pre-trained Language Models

Cited by: 1
Authors
Bellan, Patrizio [1,2]
Dragoni, Mauro [1]
Ghidini, Chiara [1]
Affiliations
[1] Fdn Bruno Kessler, Trento, Italy
[2] Free Univ Bozen Bolzano, Bolzano, Italy
Keywords
Process extraction from text; In-context learning; Knowledge graph; Pre-trained language model; Business process management
DOI
10.1007/978-3-031-27181-6_5
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The automated construction of knowledge graphs from procedural documents is a challenging research area. The lack of annotated data, as well as of raw text repositories describing real-world procedures, makes it extremely difficult to adopt deep learning approaches. Pre-trained language models have shown promising results on tasks that extract knowledge directly from the models themselves. Although several works have explored this strategy to build knowledge graphs, the viability of knowledge base construction through prompt-based learning with such language models has not yet been investigated in depth. In this work, we present a prompt-based in-context learning strategy that extracts, from natural language process descriptions, conceptual information that can be converted into an equivalent knowledge graph. The strategy is carried out in a multi-turn dialog fashion. We validate the accuracy of the proposed approach from both quantitative and qualitative perspectives. The results highlight the feasibility of the proposed approach in low-resource scenarios.
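To make the "multi-turn dialog" idea concrete, the following minimal Python sketch illustrates what such an extraction loop could look like. This is an illustration only, not the authors' implementation: the ask_llm stub, the question wording, and the performedBy relation are all hypothetical placeholders.

    def ask_llm(history, question):
        """Hypothetical stand-in for a chat-style call to a pre-trained
        language model; a real system would send the accumulated dialog
        turns in `history` followed by `question`."""
        raise NotImplementedError("plug in an actual LLM client here")

    def extract_process_kg(description):
        """Multi-turn, prompt-based extraction: each answer is appended
        to the dialog so later questions are asked in context."""
        history = ["Process description:\n" + description]
        triples = []
        # Turn 1: elicit the activities mentioned in the description.
        activities = ask_llm(history, "List the activities, one per line.")
        history.append(activities)
        for activity in activities.splitlines():
            # Follow-up turns: one contextual question per activity.
            actor = ask_llm(history, "Who performs '%s'?" % activity)
            history.append(actor)
            # Assemble a triple for the knowledge graph (the relation
            # name 'performedBy' is an assumption, not from the paper).
            triples.append((activity.strip(), "performedBy", actor.strip()))
        return triples

A real pipeline of this shape would then serialize the collected triples into the target knowledge graph format.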
Pages: 60-74
Page count: 15