Robust (Controlled) Table-to-Text Generation with Structure-Aware Equivariance Learning

Cited by: 0
Authors
Wang, Fei [1 ]
Xu, Zhewei
Szekely, Pedro
Chen, Muhao
Affiliations
[1] Univ Southern Calif, Dept Comp Sci, Los Angeles, CA 90007 USA
Funding
US National Science Foundation;
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Controlled table-to-text generation seeks to generate natural language descriptions for highlighted subparts of a table. Previous SOTA systems still employ a sequence-to-sequence generation method, which merely captures the table as a linear structure and is brittle when table layouts change. We seek to go beyond this paradigm by (1) effectively expressing the relations of content pieces in the table, and (2) making our model robust to content-invariant structural transformations. Accordingly, we propose an equivariance learning framework, LATTICE, which encodes tables with a structure-aware self-attention mechanism. This prunes the full self-attention structure into an order-invariant graph attention that captures the connected graph structure of cells belonging to the same row or column, and it differentiates between relevant and irrelevant cells from the structural perspective. Our framework also modifies the positional encoding mechanism to preserve the relative position of tokens in the same cell while enforcing position invariance among different cells. Our framework can be plugged into existing table-to-text generation models, and it improves T5-based models on ToTTo and HiTab. Moreover, on a harder version of ToTTo, it preserves promising performance, while previous SOTA systems, even with transformation-based data augmentation, see significant performance drops.
Pages: 5037-5048
Page count: 12
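The two structural ideas in the abstract (attention restricted to cells sharing a row or column, and token positions that restart inside each cell) can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's implementation: it presumes each token carries the row and column index of its cell, and the function names are invented for illustration.

```python
def lattice_attention_mask(rows, cols):
    """Sketch of the pruned, order-invariant attention described in the
    abstract: token i may attend to token j iff their cells share a row
    or a column. `rows[i]`/`cols[i]` are the (assumed) row/column
    indices of the cell containing token i."""
    n = len(rows)
    return [[rows[i] == rows[j] or cols[i] == cols[j] for j in range(n)]
            for i in range(n)]

def cell_local_positions(cell_ids):
    """Sketch of the modified positional encoding: positions restart at 0
    inside each cell, so relative order of tokens within a cell is kept
    while the absolute order of cells carries no positional signal."""
    counts, positions = {}, []
    for c in cell_ids:
        positions.append(counts.get(c, 0))
        counts[c] = counts.get(c, 0) + 1
    return positions
```

Because the mask depends only on row/column membership and the positions only on within-cell order, permuting whole rows or columns of the table leaves both unchanged, which is the content-invariant robustness the abstract claims.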