Knowledge-Grounded Dialogue Generation with Pre-trained Language Models

Times Cited: 0
Authors
Zhao, Xueliang [1 ,2 ]
Wu, Wei [3 ]
Xu, Can [4 ]
Tao, Chongyang [4 ]
Zhao, Dongyan [1 ,2 ]
Yan, Rui [1 ,2 ,5 ]
Affiliations
[1] Peking Univ, Wangxuan Inst Comp Technol, Beijing, Peoples R China
[2] Peking Univ, Ctr Data Sci, AAIS, Beijing, Peoples R China
[3] Meituan, Beijing, Peoples R China
[4] Microsoft Corp, Beijing, Peoples R China
[5] Beijing Acad Artificial Intelligence BAAI, Beijing, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
We study knowledge-grounded dialogue generation with pre-trained language models. To leverage the redundant external knowledge under capacity constraint, we propose equipping response generation defined by a pre-trained language model with a knowledge selection module, and an unsupervised approach to jointly optimizing knowledge selection and response generation with unlabeled dialogues. Empirical results on two benchmarks indicate that our model can significantly outperform state-of-the-art methods in both automatic evaluation and human judgment.
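To make the pipeline described in the abstract concrete, below is a minimal, illustrative sketch, not the authors' implementation: a bi-encoder scores knowledge candidates against the dialogue context, the top-ranked candidate is kept, and a pre-trained decoder (GPT-2 here, used only as a stand-in) generates a response conditioned on the selected knowledge and the context. The model names, prompt format, and helper functions are assumptions made for illustration; the unsupervised joint optimization of knowledge selection and generation described in the abstract is not shown.

```python
# Illustrative sketch only (NOT the authors' code): knowledge selection via a
# bi-encoder, followed by response generation with a pre-trained decoder.
import torch
from transformers import AutoTokenizer, AutoModel, GPT2LMHeadModel, GPT2Tokenizer

# --- knowledge selection: rank candidate sentences against the context ------
enc_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    """Mean-pooled encoder embeddings for a list of strings."""
    batch = enc_tok(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state      # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)         # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)          # (B, H)

def select_knowledge(context, candidates, k=1):
    """Return the k candidates most similar to the dialogue context."""
    ctx = embed([context])                               # (1, H)
    cand = embed(candidates)                             # (N, H)
    scores = torch.nn.functional.cosine_similarity(ctx, cand)
    top = scores.topk(k).indices.tolist()
    return [candidates[i] for i in top]

# --- response generation conditioned on the selected knowledge --------------
gen_tok = GPT2Tokenizer.from_pretrained("gpt2")
generator = GPT2LMHeadModel.from_pretrained("gpt2")

def generate_response(context, knowledge, max_new_tokens=40):
    # Hypothetical prompt format: knowledge and context concatenated as text.
    prompt = f"knowledge: {' '.join(knowledge)} context: {context} response:"
    ids = gen_tok(prompt, return_tensors="pt").input_ids
    out = generator.generate(ids, max_new_tokens=max_new_tokens,
                             do_sample=True, top_p=0.9,
                             pad_token_id=gen_tok.eos_token_id)
    return gen_tok.decode(out[0][ids.shape[1]:], skip_special_tokens=True)

context = "Have you seen the new space documentary?"
candidates = [
    "The documentary covers the Apollo 11 landing in 1969.",
    "Basketball was invented by James Naismith.",
]
knowledge = select_knowledge(context, candidates, k=1)
print(generate_response(context, knowledge))
```

In the paper's setting the selection module is trained jointly with the generator on unlabeled dialogues rather than relying on fixed off-the-shelf similarity scores as in this sketch.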
Pages: 3377-3390
Page count: 14
Related Papers
50 records in total
  • [1] Knowledge Grounded Pre-Trained Model For Dialogue Response Generation
    Wang, Yanmeng
    Rong, Wenge
    Zhang, Jianfei
    Ouyang, Yuanxin
    Xiong, Zhang
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [2] XDAI: A Tuning-free Framework for Exploiting Pre-trained Language Models in Knowledge Grounded Dialogue Generation
    Yu, Jifan
    Zhang, Xiaohan
    Xu, Yifan
    Lei, Xuanyu
    Guan, Xinyu
    Zhang, Jing
    Hou, Lei
    Li, Juanzi
    Tang, Jie
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 4422 - 4432
  • [3] Knowledge Base Grounded Pre-trained Language Models via Distillation
    Sourty, Raphael
    Moreno, Jose G.
    Servant, Francois-Paul
    Tamine, Lynda
    39TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2024, 2024, : 1617 - 1625
  • [4] Knowledge-Grounded Dialogue Generation with a Unified Knowledge Representation
    Li, Yu
    Peng, Baolin
    Shen, Yelong
    Mao, Yi
    Liden, Lars
    Yu, Zhou
    Gao, Jianfeng
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 206 - 218
  • [5] An Investigation of Suitability of Pre-Trained Language Models for Dialogue Generation - Avoiding Discrepancies
    Zeng, Yan
    Nie, Jian-Yun
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 4481 - 4494
  • [6] Commonsense Knowledge Reasoning and Generation with Pre-trained Language Models: A Survey
    Bhargava, Prajjwal
    Ng, Vincent
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 12317 - 12325
  • [7] Knowledge Rumination for Pre-trained Language Models
    Yao, Yunzhi
    Wang, Peng
    Mao, Shengyu
    Tan, Chuanqi
    Huang, Fei
    Chen, Huajun
    Zhang, Ningyu
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023, : 3387 - 3404
  • [8] Knowledge Inheritance for Pre-trained Language Models
    Qin, Yujia
    Lin, Yankai
    Yi, Jing
    Zhang, Jiajie
    Han, Xu
    Zhang, Zhengyan
    Su, Yusheng
    Liu, Zhiyuan
    Li, Peng
    Sun, Maosong
    Zhou, Jie
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 3921 - 3937
  • [9] Approximation of Response Knowledge Retrieval in Knowledge-grounded Dialogue Generation
    Zheng, Wen
    Milic-Frayling, Natasa
    Zhou, Ke
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020,
  • [10] Probing Pre-Trained Language Models for Disease Knowledge
    Alghanmi, Israa
    Espinosa-Anke, Luis
    Schockaert, Steven
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 3023 - 3033