Table to text generation with accurate content copying

Cited by: 0
Authors
Yang Yang
Juan Cao
Yujun Wen
Pengzhou Zhang
Affiliations
[1] Communication University of China, State Key Laboratory of Media Convergence and Communication
Source
Scientific Reports, Volume 11
DOI: not available
Abstract
Generating fluent, coherent, and informative text from structured data is called table-to-text generation. Copying words from the table is a common way to address the out-of-vocabulary problem, but accurate copying is difficult to achieve. To overcome this, we propose an auto-regressive, transformer-based framework that combines a copy mechanism with language modeling to generate target texts. First, to help the model learn the semantic correspondence between table and text, we apply a word-transformation method that incorporates field and position information into the target text, so the model learns where to copy from. We then propose two auxiliary learning objectives: a table-text constraint loss, which models the table inputs more effectively, and a copy loss, which encourages precise copying of word fragments from the table. Furthermore, we improve the text search strategy to reduce the probability of generating incoherent and repetitive sentences. Experiments on two datasets show better results than the baseline model. On WIKIBIO, BLEU improves from 45.47 to 46.87 and ROUGE from 41.54 to 42.28. On ROTOWIRE, the result improves by 4.29% on the CO metric and by 1.93 points on BLEU.
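The page gives only the abstract, so the snippet below is a generic pointer-generator-style sketch of the copy idea the abstract describes, not the authors' actual model. All function and variable names are illustrative: at each decoding step, a gate `p_copy` mixes the decoder's vocabulary distribution with an attention distribution over the source table cells, routing copy probability mass onto the words those cells contain.

```python
import math

def softmax(scores):
    # subtract the max for numerical stability
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def copy_generate_step(vocab_logits, attn_scores, src_token_ids, p_copy):
    """One decoding step of a generic copy mechanism.

    vocab_logits  : decoder scores over the output vocabulary
    attn_scores   : attention scores over the source table cells
    src_token_ids : vocabulary id of the word held by each table cell
    p_copy        : gate in [0, 1], the probability of copying at this step
    """
    p_vocab = softmax(vocab_logits)   # generation distribution
    p_attn = softmax(attn_scores)     # where-to-copy distribution over cells
    mixed = [(1.0 - p_copy) * p for p in p_vocab]
    for cell_pos, token_id in enumerate(src_token_ids):
        # route copy probability mass onto the copied word's vocabulary id
        mixed[token_id] += p_copy * p_attn[cell_pos]
    return mixed

# toy step: vocabulary of 6 words, table with 3 cells holding ids 4, 2, 5;
# a high attention score on the first cell plus a strong copy gate makes
# that cell's word the most probable output
dist = copy_generate_step(
    vocab_logits=[0.1, 2.0, 0.3, 0.0, -1.0, 0.5],
    attn_scores=[3.0, 0.1, 0.1],
    src_token_ids=[4, 2, 5],
    p_copy=0.8,
)
```

The mixed distribution still sums to 1, and rare table words (e.g. player names in ROTOWIRE) receive probability even when the generator alone would almost never produce them, which is the mechanism's point.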
Related papers (50 total)
  • [21] Feng, Xiaocheng; Gong, Heng; Chen, Yuyu; Sun, Yawei; Qin, Bing; Bi, Wei; Liu, Xiaojiang; Liu, Ting. Learning number reasoning for numerical table-to-text generation. International Journal of Machine Learning and Cybernetics, 2021, 12(8): 2269-2280.
  • [22] Logeswaran, Lajanugen; Lee, Honglak; Bengio, Samy. Content preserving text generation with attribute controls. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018.
  • [23] Ott, J. P. Listing copying Apple text files. Creative Computing, 1982, 8(5): 144.
  • [24] Li, Liang; Geng, Ruiying; Fang, Chengyang; Li, Bing; Ma, Can; Li, Binhua; Li, Yongbin. Plan-then-Seam: Towards Efficient Table-to-Text Generation. EACL 2023: 205-219.
  • [25] Wang, Peng; Lin, Junyang; Yang, An; Zhou, Chang; Zhang, Yichang; Zhou, Jingren; Yang, Hongxia. Sketch and Refine: Towards Faithful and Informative Table-to-Text Generation. Findings of ACL-IJCNLP 2021: 4831-4843.
  • [26] Alonso, Inigo; Agirre, Eneko. Automatic Logical Forms improve fidelity in Table-to-Text generation. Expert Systems with Applications, 2024, 238.
  • [27] Li, Tong; Wang, Zhihao; Shao, Liangying; Zheng, Xuling; Wang, Xiaoli; Su, Jinsong. A Sequence-to-Sequence&Set Model for Text-to-Table Generation. Findings of ACL 2023: 5358-5370.
  • [28] Wu, Jie; Hou, Mengshu. Enhancing diversity for logical table-to-text generation with mixture of experts. Expert Systems, 2024, 41(4).
  • [29] Su, Yixuan; Meng, Zaiqiao; Baker, Simon; Collier, Nigel. Few-Shot Table-to-Text Generation with Prototype Memory. Findings of EMNLP 2021: 910-917.
  • [30] Ghosh, Sayan; Qi, Zheng; Chaturvedi, Snigdha; Srivastava, Shashank. How Helpful is Inverse Reinforcement Learning for Table-to-Text Generation? ACL-IJCNLP 2021, Vol. 2: 71-79.