Long and Diverse Text Generation with Planning-based Hierarchical Variational Model

Citations: 0
Authors
Shao, Zhihong [1 ,2 ,3 ]
Huang, Minlie [1 ,2 ,3 ]
Wen, Jiangtao [1 ,2 ,3 ]
Xu, Wenfei [4 ]
Zhu, Xiaoyan [1 ,2 ,3 ]
Affiliations
[1] Tsinghua Univ, State Key Lab Intelligent Technol & Syst, Inst Artificial Intelligence, Beijing 100084, Peoples R China
[2] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing 100084, Peoples R China
[3] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[4] Baozun, Shanghai, Peoples R China
Funding
U.S. National Science Foundation; National Key R&D Program of China;
Keywords
NATURAL-LANGUAGE GENERATION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Existing neural methods for data-to-text generation are still struggling to produce long and diverse texts: they are insufficient to model input data dynamically during generation, to capture inter-sentence coherence, or to generate diversified expressions. To address these issues, we propose a Planning-based Hierarchical Variational Model (PHVM). Our model first plans a sequence of groups (each group is a subset of input items to be covered by a sentence) and then realizes each sentence conditioned on the planning result and the previously generated context, thereby decomposing long text generation into dependent sentence generation sub-tasks. To capture expression diversity, we devise a hierarchical latent structure where a global planning latent variable models the diversity of reasonable planning and a sequence of local latent variables controls sentence realization. Experiments show that our model outperforms state-of-the-art baselines in long and diverse text generation.
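To make the planning-then-realization mechanism described in the abstract concrete, below is a minimal, self-contained PyTorch sketch: a global planning latent variable conditions a plan decoder that emits one group of input items per step, and a local latent variable conditions the realization of each sentence on that group and the previously generated context. All names (PHVMSketch, plan_rnn, group_head, etc.), the dimensions, and the thresholded group-selection heuristic are illustrative assumptions, not the authors' released implementation.

```python
# Illustrative sketch of the PHVM idea (assumed module names and sizes),
# not the paper's actual implementation.
import torch
import torch.nn as nn

class PHVMSketch(nn.Module):
    def __init__(self, n_items=50, vocab=1000, d=64):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, d)
        self.plan_prior = nn.Linear(d, 2 * d)           # global planning latent z_plan
        self.plan_rnn = nn.GRUCell(d, d)                # emits one group per step
        self.group_head = nn.Linear(2 * d, 1)           # scores item membership in a group
        self.sent_prior = nn.Linear(2 * d, 2 * d)       # local latent z_sent per sentence
        self.sent_rnn = nn.GRU(d, d, batch_first=True)  # sentence realization
        self.out = nn.Linear(d, vocab)

    @staticmethod
    def sample(stats):
        # Reparameterization trick: stats holds [mu, logvar].
        mu, logvar = stats.chunk(2, dim=-1)
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp()

    def forward(self, item_ids, n_sentences=3, sent_len=8):
        items = self.item_emb(item_ids)                 # (n_items, d) input record
        ctx = items.mean(0, keepdim=True)               # (1, d) summary of the input
        z_plan = self.sample(self.plan_prior(ctx))      # global planning latent
        h_plan, prev = z_plan, torch.zeros_like(ctx)    # prev = previous sentence context
        sentences = []
        for _ in range(n_sentences):
            # Plan step: choose a group (subset of input items) for the next sentence.
            h_plan = self.plan_rnn(prev, h_plan)
            scores = self.group_head(
                torch.cat([items, h_plan.expand_as(items)], -1)).squeeze(-1)
            group = items[scores.sigmoid() > 0.5]       # assumed threshold heuristic
            g = group.mean(0, keepdim=True) if len(group) else ctx
            # Local latent conditioned on the group and the previous context.
            z_sent = self.sample(self.sent_prior(torch.cat([g, prev], -1)))
            # Realize one sentence from z_sent (teacher forcing omitted for brevity).
            inp = z_sent.unsqueeze(1).repeat(1, sent_len, 1)
            out, h = self.sent_rnn(inp)
            sentences.append(self.out(out).argmax(-1).squeeze(0))
            prev = h.squeeze(0)                         # context feeds the next plan step
        return sentences

# Usage: generate word ids for a toy input record of 5 attribute-value items.
print(PHVMSketch()(torch.arange(5)))
```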
Pages: 3257-3268
Number of pages: 12
Related Papers
50 records in total
  • [1] A Transformer-Based Hierarchical Variational AutoEncoder Combined Hidden Markov Model for Long Text Generation
    Zhao, Kun
    Ding, Hongwei
    Ye, Kai
    Cui, Xiaohui
    ENTROPY, 2021, 23 (10)
  • [2] Hierarchical planning-based crowd formation
    Liu, Na
    Wang, Xingce
    Liu, Shaolong
    Wu, Zhongke
    He, Jiale
    Cheng, Peng
    Miao, Chunyan
    Thalmann, Nadia Magnenat
    COMPUTER ANIMATION AND VIRTUAL WORLDS, 2019, 30 (06)
  • [3] A Goal-Based Model of Personality for Planning-Based Narrative Generation
    Bahamon, Julio Cesar
    Barot, Camille
    Young, R. Michael
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 4142 - 4143
  • [4] Planning-Based Models of Natural Language Generation
    Garoufi, Konstantina
    LANGUAGE AND LINGUISTICS COMPASS, 2014, 8 (01): 1 - 10
  • [5] Planning-Based Narrative Generation in Simulated Game Universes
    Chang, Hsueh-Min
    Soo, Von-Wun
    IEEE TRANSACTIONS ON COMPUTATIONAL INTELLIGENCE AND AI IN GAMES, 2009, 1 (03) : 200 - 213
  • [6] Learning Hierarchical Planning-Based Policies from Offline Data
    Woehlke, Jan
    Schmitt, Felix
    van Hoof, Herke
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT IV, 2023, 14172 : 489 - 505
  • [7] AI planning-based approach of attack graph generation
    Chen, Feng
    Su, Jin-Shu
    Han, Wen-Bao
    Jiefangjun Ligong Daxue Xuebao/Journal of PLA University of Science and Technology (Natural Science Edition), 2008, 9 (05): 460 - 465
  • [8] Hierarchical Text Generation and Planning for Strategic Dialogue
    Yarats, Denis
    Lewis, Mike
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [9] Data-to-text Generation with Variational Sequential Planning
    Puduppully, Ratish
    Fu, Yao
    Lapata, Mirella
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2022, 10 : 697 - 715
  • [10] A Latent Variable Model with Hierarchical Structure and GPT-2 for Long Text Generation
    Zhao, Kun
    Ding, Hongwei
    Ye, Kai
    Cui, Xiaohui
    Fu, Zhongwang
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2021, PT V, 2021, 12895 : 297 - 308