RecGPT: Generative Pre-training for Text-based Recommendation

Cited by: 0
Authors
Hoang Ngo [1]
Dat Quoc Nguyen [1]
Affiliations
[1] VinAI Research, Ho Chi Minh City, Vietnam
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We present the first domain-adapted and fully-trained large language model, RecGPT-7B, and its instruction-following variant, RecGPT-7B-Instruct, for text-based recommendation. Experimental results on rating prediction and sequential recommendation tasks show that our model, RecGPT-7B-Instruct, outperforms previous strong baselines. We are releasing our RecGPT models as well as their pretraining and fine-tuning datasets to facilitate future research and downstream applications in text-based recommendation. Public Hugging Face links to our RecGPT models and datasets are available at: https://github.com/VinAIResearch/RecGPT.
Pages: 302 - 313
Page count: 12
Related Papers
50 items in total
  • [11] Learning Visual Prior via Generative Pre-Training
    Xie, Jinheng
    Ye, Kai
    Li, Yudong
    Li, Yuexiang
    Lin, Kevin Qinghong
    Zheng, Yefeng
    Shen, Linlin
    Shou, Mike Zheng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [12] NatGen: Generative Pre-training by "Naturalizing" Source Code
    Chakraborty, Saikat
    Ahmed, Toufique
    Ding, Yangruibo
    Devanbu, Premkumar T.
    Ray, Baishakhi
    PROCEEDINGS OF THE 30TH ACM JOINT MEETING EUROPEAN SOFTWARE ENGINEERING CONFERENCE AND SYMPOSIUM ON THE FOUNDATIONS OF SOFTWARE ENGINEERING, ESEC/FSE 2022, 2022, : 18 - 30
  • [13] Pre-training of Graph Augmented Transformers for Medication Recommendation
    Shang, Junyuan
    Ma, Tengfei
    Xiao, Cao
    Sun, Jimeng
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 5953 - 5959
  • [14] Multi-Modal Contrastive Pre-training for Recommendation
    Liu, Zhuang
    Ma, Yunpu
    Schubert, Matthias
    Ouyang, Yuanxin
    Xiong, Zhang
    PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2022, 2022, : 99 - 108
  • [15] Denoising based Sequence-to-Sequence Pre-training for Text Generation
    Wang, Liang
    Zhao, Wei
    Jia, Ruoyu
    Li, Sujian
    Liu, Jingming
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 4003 - 4015
  • [16] Graph Neural Pre-training for Recommendation with Side Information
    Liu, Siwei
    Meng, Zaiqiao
    Macdonald, Craig
    Ounis, Iadh
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2023, 41 (03)
  • [17] TNT: Text Normalization based Pre-training of Transformers for Content Moderation
    Tan, Fei
    Hu, Yifan
    Hu, Changwei
    Li, Keqian
    Yen, Kevin
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 4735 - 4741
  • [18] RecipeGPT: Generative Pre-training Based Cooking Recipe Generation and Evaluation System
    Lee, Helena H.
    Shu, Ke
    Achananuparp, Palakorn
    Prasetyo, Philips Kokoh
    Liu, Yue
    Lim, Ee-Peng
    Varshney, Lav R.
    WWW'20: COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2020, 2020, : 181 - 184
  • [19] Self-supervised Pre-training of Text Recognizers
    Kiss, Martin
    Hradis, Michal
    DOCUMENT ANALYSIS AND RECOGNITION-ICDAR 2024, PT IV, 2024, 14807 : 218 - 235
  • [20] Image-Text Pre-Training for Logo Recognition
    Hubenthal, Mark
    Kumar, Suren
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 1145 - 1154