Editorial for Special Issue on Pre-trained Large Language Models for Information Processing

Cited by: 0
Authors:
Wang, Bin [1 ]
Kawahara, Tatsuya [2 ]
Li, Haizhou [3 ]
Meng, Helen [4 ]
Wu, Chung-Hsien [5 ]
Affiliations:
[1] A*STAR, Institute for Infocomm Research, Singapore
[2] Kyoto University, Kyoto, Japan
[3] The Chinese University of Hong Kong, Shenzhen, China
[4] The Chinese University of Hong Kong, Hong Kong, China
[5] National Cheng Kung University, Tainan, Taiwan
DOI: 10.1561/116.00004100
Chinese Library Classification (CLC): TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Codes: 0808; 0809
Pages: 3
Related Papers (50 total; entries [41]-[50] shown):
• [41] Azim, Anee; Clark, Leon; Lau, Caleb; Cobb, Miles; Jenner, Kendall. Grounding Ontologies with Pre-Trained Large Language Models for Activity Based Intelligence. Signal Processing, Sensor/Information Fusion, and Target Recognition XXXIII, 2024, 13057.
• [42] Wang, Wenjie; Liu, Zheng; Feng, Fuli; Dou, Zhicheng; Ai, Qingyao; Yang, Grace Hui; Lian, Defu; Hou, Lu; Sun, Aixin; Zamani, Hamed; Metzler, Donald; de Rijke, Maarten. Pre-Trained Models for Search and Recommendation: Introduction to the Special Issue - Part 1. ACM Transactions on Information Systems, 2025, 43 (2).
• [43] Hu, Yunwei; Goktas, Yavuz; Yellamati, David Deepak; De Tassigny, Catherine. The Use and Misuse of Pre-Trained Generative Large Language Models in Reliability Engineering. 2024 Annual Reliability and Maintainability Symposium (RAMS), 2024.
• [44] Colavito, Giuseppe; Lanubile, Filippo; Novielli, Nicole; Quaranta, Luigi. Impact of Data Quality for Automatic Issue Classification Using Pre-Trained Language Models. Journal of Systems and Software, 2024, 210.
• [45] Xu, Weiwen; Li, Xin; Zhang, Wenxuan; Zhou, Meng; Lam, Wai; Si, Luo; Bing, Lidong. From Cloze to Comprehension: Retrofitting Pre-trained Masked Language Models to Pre-trained Machine Reader. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
• [46] Alghanmi, Israa; Espinosa-Anke, Luis; Schockaert, Steven. Probing Pre-Trained Language Models for Disease Knowledge. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021: 3023-3033.
• [47] Durrani, Nadir; Sajjad, Hassan; Dalvi, Fahim; Belinkov, Yonatan. Analyzing Individual Neurons in Pre-trained Language Models. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 4865-4880.
• [48] Casas, Jacky; Torche, Samuel; Daher, Karl; Mugellini, Elena; Abou Khaled, Omar. Emotional Paraphrasing Using Pre-trained Language Models. 2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), 2021.
• [49] Li, Lei; Lin, Yankai; Ren, Shuhuai; Li, Peng; Zhou, Jie; Sun, Xu. Dynamic Knowledge Distillation for Pre-trained Language Models. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 379-389.
• [50] Yao, Yuan; Dong, Bowen; Zhang, Ao; Zhang, Zhengyan; Xie, Ruobing; Liu, Zhiyuan; Lin, Leyu; Sun, Maosong; Wang, Jianyong. Prompt Tuning for Discriminative Pre-trained Language Models. Findings of the Association for Computational Linguistics (ACL 2022), 2022: 3468-3473.