Long and Diverse Text Generation with Planning-based Hierarchical Variational Model

Cited by: 0
Authors
Shao, Zhihong [1 ,2 ,3 ]
Huang, Minlie [1 ,2 ,3 ]
Wen, Jiangtao [1 ,2 ,3 ]
Xu, Wenfei [4 ]
Zhu, Xiaoyan [1 ,2 ,3 ]
Affiliations
[1] Tsinghua Univ, State Key Lab Intelligent Technol & Syst, Inst Artificial Intelligence, Beijing 100084, Peoples R China
[2] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing 100084, Peoples R China
[3] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[4] Baozun, Shanghai, Peoples R China
Funding
U.S. National Science Foundation; National Key Research and Development Program;
Keywords
NATURAL-LANGUAGE GENERATION;
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Existing neural methods for data-to-text generation are still struggling to produce long and diverse texts: they are insufficient to model input data dynamically during generation, to capture inter-sentence coherence, or to generate diversified expressions. To address these issues, we propose a Planning-based Hierarchical Variational Model (PHVM). Our model first plans a sequence of groups (each group is a subset of input items to be covered by a sentence) and then realizes each sentence conditioned on the planning result and the previously generated context, thereby decomposing long text generation into dependent sentence generation sub-tasks. To capture expression diversity, we devise a hierarchical latent structure where a global planning latent variable models the diversity of reasonable planning and a sequence of local latent variables controls sentence realization. Experiments show that our model outperforms state-of-the-art baselines in long and diverse text generation.
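The decomposition described above (plan a sequence of groups, then realize one sentence per group, with a global planning latent variable and per-sentence local latent variables) can be made concrete with a small sketch. The PyTorch code below is an illustration under stated assumptions, not the authors' released implementation: the module names, dimensions, soft group selection, and prior-only sampling are simplifications introduced here.

```python
# Minimal sketch of a planning-then-realization generator with a global planning
# latent and per-sentence local latents. Hypothetical names/dimensions; not PHVM's code.
import torch
import torch.nn as nn


class PlanningHierarchicalSketch(nn.Module):
    def __init__(self, item_dim=64, hidden_dim=128, latent_dim=32, vocab_size=1000):
        super().__init__()
        self.item_encoder = nn.GRU(item_dim, hidden_dim, batch_first=True)
        # Global planning latent z_plan: prior conditioned on the encoded input items.
        self.plan_prior = nn.Linear(hidden_dim, 2 * latent_dim)          # -> (mu, logvar)
        # Plan decoder: one step per sentence, deciding which items that sentence covers.
        self.plan_rnn = nn.GRUCell(latent_dim + hidden_dim, hidden_dim)
        self.group_scorer = nn.Linear(hidden_dim + item_dim, 1)
        # Local latent z_sent per sentence: controls how that sentence is expressed.
        self.sent_prior = nn.Linear(hidden_dim, 2 * latent_dim)
        # Sentence decoder: realizes one sentence from its group and local latent.
        self.sent_rnn = nn.GRU(latent_dim + item_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    @staticmethod
    def sample(stats):
        # Reparameterized sample from a diagonal Gaussian given concatenated (mu, logvar).
        mu, logvar = stats.chunk(2, dim=-1)
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp()

    def forward(self, items, num_sentences=3, max_len=10):
        # items: (batch, num_items, item_dim) -- embedded input data records.
        _, h = self.item_encoder(items)
        h = h.squeeze(0)                                  # (batch, hidden_dim)
        z_plan = self.sample(self.plan_prior(h))          # global planning latent

        plan_state, sentences = h, []
        for _ in range(num_sentences):
            # Planning step: score each input item for coverage by the next sentence.
            plan_state = self.plan_rnn(torch.cat([z_plan, h], dim=-1), plan_state)
            expanded = plan_state.unsqueeze(1).expand(-1, items.size(1), -1)
            scores = self.group_scorer(torch.cat([expanded, items], dim=-1)).squeeze(-1)
            # Soft group: weighted sum of items (the paper uses discrete subsets).
            group = (scores.sigmoid().unsqueeze(-1) * items).sum(dim=1)

            # Realization step: local latent plus group condition the sentence decoder.
            z_sent = self.sample(self.sent_prior(plan_state))
            step_in = torch.cat([z_sent, group], dim=-1).unsqueeze(1).repeat(1, max_len, 1)
            dec_out, _ = self.sent_rnn(step_in)
            sentences.append(self.out(dec_out).argmax(dim=-1))  # greedy token ids
        return sentences


if __name__ == "__main__":
    model = PlanningHierarchicalSketch()
    fake_items = torch.randn(2, 5, 64)      # 2 records, 5 attribute items each
    print([s.shape for s in model(fake_items)])  # 3 sentences of token ids per record
```

At training time such a model would pair each prior with a recognition (posterior) network and optimize an ELBO; the sketch shows only the generative path so the planning/realization split stays visible.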
Pages: 3257-3268
Number of pages: 12
Related Papers
50 items in total
  • [41] PAIR: Planning and Iterative Refinement in Pre-trained Transformers for Long Text Generation
    Hua, Xinyu
    Wang, Lu
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 781 - 793
  • [42] PLANET: Dynamic Content Planning in Autoregressive Transformers for Long-form Text Generation
    Hu, Zhe
    Chan, Hou Pong
    Liu, Jiachen
    Xiao, Xinyan
    Wu, Hua
    Huang, Lifu
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 2288 - 2305
  • [43] Knowledge representation model for sentence planning module in multilingual text generation system
    Yin, Ling
    Zhang, Dongmo
    Jisuanji Gongcheng/Computer Engineering, 2000, 26 (03): 3 - 5
  • [44] A text image generation model based on deep learning
    Wang, Jing
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2023, 45 (03) : 4979 - 4989
  • [45] Model-based long-term electricity generation system planning under uncertainty
    Sun, Ninghong
    Ellersdorfer, Ingo
    Swider, Derk J.
    2008 THIRD INTERNATIONAL CONFERENCE ON ELECTRIC UTILITY DEREGULATION AND RESTRUCTURING AND POWER TECHNOLOGIES, VOLS 1-6, 2008, : 1298 - 1304
  • [46] TILGAN: Transformer-based Implicit Latent GAN for Diverse and Coherent Text Generation
    Diao, Shizhe
    Shen, Xinwei
    Shum, KaShun
    Song, Yan
    Zhang, Tong
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 4844 - 4858
  • [47] HBert: A Long Text Processing Method Based on BERT and Hierarchical Attention Mechanisms
    Lv, Xueqiang
    Liu, Zhaonan
    Zhao, Ying
    Xu, Ge
    You, Xindong
    INTERNATIONAL JOURNAL ON SEMANTIC WEB AND INFORMATION SYSTEMS, 2023, 19 (01)
  • [48] Diverse and Aligned Audio-to-Video Generation via Text-to-Video Model Adaptation
    Yariv, Guy
    Gat, Itai
    Benaim, Sagie
    Wolf, Lior
    Schwartz, Idan
    Adi, Yossi
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 7, 2024, : 6639 - 6647
  • [49] Generation Expansion Planning Model Supporting Diverse Environmental Policies for Reduction of Greenhouse Gases
    Lee, Jeong-In
    Lee, Il-Woo
    Kim, Bal-Ho
    ETRI JOURNAL, 2015, 37 (02) : 295 - 305
  • [50] A Case-Based Approach for Content Planning in Data-to-Text Generation
    Upadhyay, Ashish
    Massie, Stewart
    CASE-BASED REASONING RESEARCH AND DEVELOPMENT, ICCBR 2022, 2022, 13405 : 380 - 394