Constructing Chinese taxonomy trees from understanding and generative pretrained language models

Cited by: 0
Authors
Guo, Jianyu [1 ]
Chen, Jingnan [1 ]
Ren, Li [1 ]
Zhou, Huanlai [1 ]
Xu, Wenbo [1 ]
Jia, Haitao [1 ]
Affiliations
[1] University of Electronic Science and Technology of China, Chengdu, Sichuan, China
Keywords (Compendex)
Hypertext systems; Natural language processing systems; Taxonomies; Trees (mathematics)
DOI
10.7717/peerj-cs.2358
Related papers (50 in total)
  • [1] Constructing Chinese taxonomy trees from understanding and generative pretrained language models
    Guo, Jianyu
    Chen, Jingnan
    Ren, Li
    Zhou, Huanlai
    Xu, Wenbo
    Jia, Haitao
    PEERJ COMPUTER SCIENCE, 2024, 10
  • [2] Constructing Taxonomies from Pretrained Language Models
    Chen, Catherine
    Lin, Kevin
    Klein, Dan
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 4687 - 4700
  • [3] TaleBrush: Sketching Stories with Generative Pretrained Language Models
    Chung, John Joon Young
    Kim, Wooseok
    Yoo, Kang Min
    Lee, Hwaran
    Adar, Eytan
    Chang, Minsuk
    PROCEEDINGS OF THE 2022 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI '22), 2022
  • [4] Low-resource Taxonomy Enrichment with Pretrained Language Models
    Takeoka, Kunihiro
    Akimoto, Kosuke
    Oyamada, Masafumi
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 2747 - 2758
  • [5] Data Augmentation for Spoken Language Understanding via Pretrained Language Models
    Peng, Baolin
    Zhu, Chenguang
    Zeng, Michael
    Gao, Jianfeng
    INTERSPEECH 2021, 2021, : 1219 - 1223
  • [6] Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale
    Hu, Xiang
    Ji, Pengyu
    Zhu, Qingyang
    Wu, Wei
    Tu, Kewei
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 2640 - 2657
  • [7] The Heuristic Core: Understanding Subnetwork Generalization in Pretrained Language Models
    Bhaskar, Adithya
    Chen, Danqi
    Friedman, Dan
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 14351 - 14368
  • [8] Sub-Character Tokenization for Chinese Pretrained Language Models
    Si, Chenglei
    Zhang, Zhengyan
    Chen, Yingfa
    Qi, Fanchao
    Wang, Xiaozhi
    Liu, Zhiyuan
    Wang, Yasheng
    Liu, Qun
    Sun, Maosong
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2023, 11 : 469 - 487
  • [9] Augmenting Slot Values and Contexts for Spoken Language Understanding with Pretrained Models
    Lin, Haitao
    Xiang, Lu
    Zhou, Yu
    Zhang, Jiajun
    Zong, Chengqing
    INTERSPEECH 2021, 2021, : 4703 - 4707
  • [10] Rethinking the Construction of Effective Metrics for Understanding the Mechanisms of Pretrained Language Models
    Li, You
    Yin, Jinhui
    Lin, Yuming
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 13399 - 13412