Enabling controllable table-to-text generation via prompting large language models with guided planning

Cited by: 1
Authors
Zhao, Shuo [1 ]
Sun, Xin [1 ]
Affiliations
[1] Beijing Inst Technol, Sch Comp Sci, Beijing 100081, Peoples R China
Funding
National Key Research and Development Program of China;
Keywords
Large language models; Controllable text generation; Few-shot table-to-text generation;
DOI
10.1016/j.knosys.2024.112571
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, Large Language Models (LLMs) have demonstrated unparalleled capabilities in understanding and generation, which holds promising prospects for applying them to table-to-text generation. However, the generation process with LLMs lacks a high degree of controllability, which hinders their use for table-to-text generation. In this paper, we introduce Poised, an effective method that prompts LLMs with guided planning to achieve controllable table-to-text generation. Specifically, we first employ prefix-tuning on BART to derive a plan from the given table. Then, we combine the plan with guided instructions to create a comprehensive prompt, which is then fed into an LLM to generate the description of the table. Experiments across three domains of the few-shot Wiki dataset show that Poised achieves or approaches a plan completion rate of 100%, with an average hallucination frequency of less than 10%. Furthermore, Poised allows for fine-grained control over the generated content by intentionally modifying the prompt, enabling precise control over aspects such as attribute realization order.
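The pipeline summarized in the abstract (a planner derives an attribute order from the table, and the plan is combined with guided instructions into a prompt for an LLM) can be sketched in a few lines. The snippet below is a minimal illustration only, not the authors' implementation: the use of facebook/bart-base as a stand-in for the prefix-tuned BART planner, the linearize_table and build_prompt helpers, and the wording of the guided instructions are all assumptions made for this sketch.

```python
# Minimal sketch of a Poised-style prompting pipeline.
# Assumptions: facebook/bart-base stands in for the paper's prefix-tuned BART
# planner, and the prompt-wording below is illustrative, not the original.
from transformers import BartForConditionalGeneration, BartTokenizer


def linearize_table(table: dict) -> str:
    # Flatten attribute-value pairs into a single string, e.g. "name: Ada Lovelace | born: 1815".
    return " | ".join(f"{k}: {v}" for k, v in table.items())


def derive_plan(table: dict, model_name: str = "facebook/bart-base") -> str:
    # Stand-in for the prefix-tuned planner: maps the linearized table to a plan,
    # i.e. an ordered list of attributes the description should realize.
    tokenizer = BartTokenizer.from_pretrained(model_name)
    model = BartForConditionalGeneration.from_pretrained(model_name)
    inputs = tokenizer(linearize_table(table), return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_length=64, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


def build_prompt(table: dict, plan: str) -> str:
    # Combine the plan with guided instructions into one comprehensive prompt.
    return (
        "Describe the table in fluent text.\n"
        f"Table: {linearize_table(table)}\n"
        f"Plan (realize the attributes in this order): {plan}\n"
        "Mention every planned attribute exactly once and state nothing that is "
        "not supported by the table."
    )


if __name__ == "__main__":
    table = {"name": "Ada Lovelace", "born": "1815", "occupation": "mathematician"}
    plan = derive_plan(table)            # in the paper: output of the prefix-tuned planner
    prompt = build_prompt(table, plan)   # fed to an instruction-following LLM
    print(prompt)
```

In the paper's setting, the composed prompt would be passed to an LLM, and control (e.g., over attribute realization order) is exercised by editing the plan portion of the prompt; the sketch simply prints the prompt.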
Pages: 9
Related Papers
50 records in total
  • [1] Leveraging Large Language Models for Flexible and Robust Table-to-Text Generation
    Oro, Ermelinda
    De Grandis, Luca
    Granata, Francesco Maria
    Ruffolo, Massimo
    DATABASE AND EXPERT SYSTEMS APPLICATIONS, PT I, DEXA 2024, 2024, 14910 : 222 - 227
  • [2] Table-to-Text Generation With Pretrained Diffusion Models
    Krylov, Aleksei S.
    Somov, Oleg D.
    IEEE ACCESS, 2024, 12 : 110517 - 110525
  • [3] Enhancing Content Planning for Table-to-Text Generation with Data Understanding and Verification
    Gong, Heng
    Bi, Wei
    Feng, Xiaocheng
    Qin, Bing
    Liu, Xiaojiang
    Liu, Ting
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020,
  • [4] Classifiers Guided Controllable Text Generation for Discrete Diffusion Language Models
    Jiang, Hang
    Cai, Guoyong
    Li, Sihui
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, PT III, NLPCC 2024, 2025, 15361 : 132 - 144
  • [5] Table-to-Text Generation via Row-Aware Hierarchical Encoder
    Gong, Heng
    Feng, Xiaocheng
    Qin, Bing
    Liu, Ting
    CHINESE COMPUTATIONAL LINGUISTICS, CCL 2019, 2019, 11856 : 533 - 544
  • [6] Controllable Generation from Pre-trained Language Models via Inverse Prompting
    Zou, Xu
    Yin, Da
    Zhong, Qingyang
    Yang, Hongxia
    Yang, Zhilin
    Tang, Jie
    KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 2450 - 2460
  • [7] LOFT: Enhancing Faithfulness and Diversity for Table-to-Text Generation via Logic Form Control
    Zhao, Yilun
    Qi, Zhenting
    Nan, Linyong
    Flores, Lorenzo Jaime Yu
    Radev, Dragomir
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 554 - 561
  • [8] Enabling Large Language Models to Generate Text with Citations
    Gao, Tianyu
    Yen, Howard
    Yu, Jiatong
    Chen, Danqi
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023, : 6465 - 6488
  • [9] Knowledge-Infused Prompting: Assessing and Advancing Clinical Text Data Generation with Large Language Models
    Xu, Ran
    Cui, Hejie
    Yu, Yue
    Kan, Xuan
    Shi, Wenqi
    Zhuang, Yuchen
    Wang, May D.
    Jin, Wei
    Ho, Joyce C.
    Yang, Carl
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: ACL 2024, 2024, : 15496 - 15523
  • [10] Automatic Lesson Plan Generation via Large Language Models with Self-critique Prompting
    Zheng, Ying
    Li, Xueyi
    Huang, Yaying
    Liang, Qianru
    Guo, Teng
    Hou, Mingliang
    Gao, Boyu
    Tian, Mi
    Liu, Zitao
    Luo, Weiqi
    ARTIFICIAL INTELLIGENCE IN EDUCATION: POSTERS AND LATE BREAKING RESULTS, WORKSHOPS AND TUTORIALS, INDUSTRY AND INNOVATION TRACKS, PRACTITIONERS, DOCTORAL CONSORTIUM AND BLUE SKY, AIED 2024, PT I, 2024, 2150 : 163 - 178