Learning Compact Compositional Embeddings via Regularized Pruning for Recommendation

Cited by: 2
Authors
Liang, Xurong [1 ]
Chen, Tong [1 ]
Quoc Viet Hung Nguyen [2 ]
Li, Jianxin [3 ]
Yin, Hongzhi [1 ]
Affiliations
[1] Univ Queensland, Brisbane, Qld, Australia
[2] Griffith Univ, Nathan, Qld, Australia
[3] Deakin Univ, Geelong, Vic, Australia
Funding
Australian Research Council;
Keywords
lightweight recommender systems; compositional embeddings; regularized pruning;
DOI
10.1109/ICDM58522.2023.00047
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Latent factor models are the dominant backbones of contemporary recommender systems (RSs) given their performance advantages, where a unique vector embedding with a fixed dimensionality (e.g., 128) is required to represent each entity (commonly a user/item). Due to the large number of users and items on e-commerce sites, the embedding table is arguably the least memory-efficient component of RSs. For any lightweight recommender that aims to efficiently scale with the growing number of users/items or to remain applicable in resource-constrained settings, existing solutions either reduce the number of embeddings needed via hashing, or sparsify the full embedding table to switch off selected embedding dimensions. However, as hash collisions arise or embeddings become overly sparse, especially when adapting to a tighter memory budget, those lightweight recommenders inevitably have to compromise their accuracy. To this end, we propose a novel compact embedding framework for RSs, namely Compositional Embedding with Regularized Pruning (CERP). Specifically, CERP represents each entity by combining a pair of embeddings from two independent, substantially smaller meta-embedding tables, which are then jointly pruned via a learnable element-wise threshold. In addition, we innovatively design a regularized pruning mechanism in CERP, such that the two sparsified meta-embedding tables are encouraged to encode information that is mutually complementary. Given its compatibility with arbitrary latent factor models, we pair CERP with two popular recommendation models for extensive experiments, where results on two real-world datasets under different memory budgets demonstrate its superiority against state-of-the-art baselines. The codebase of CERP is available at https://github.com/xurong-liang/CERP.
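The abstract's core mechanism (two small meta-embedding tables composed per entity, each sparsified by a learnable element-wise threshold) can be sketched roughly as below. This is a minimal illustration under our own assumptions, not the authors' implementation: the quotient/remainder indexing, the soft-threshold (shrinkage) operator, and the element-wise sum composition are all plausible guesses, and every name here is hypothetical; consult the official codebase at https://github.com/xurong-liang/CERP for the actual method.

import torch
import torch.nn as nn

class CompositionalPrunedEmbedding(nn.Module):
    """Hypothetical sketch: compose each entity's embedding from two small
    meta-tables, each sparsified by a learnable element-wise threshold."""

    def __init__(self, num_entities: int, bucket_size: int, dim: int):
        super().__init__()
        assert num_entities <= bucket_size ** 2, "buckets must cover all ids"
        self.bucket_size = bucket_size
        # Two independent meta-embedding tables, each with bucket_size rows
        # instead of a single table with num_entities rows.
        self.q_table = nn.Embedding(bucket_size, dim)  # quotient-indexed
        self.r_table = nn.Embedding(bucket_size, dim)  # remainder-indexed
        # One learnable threshold per element of each table; pruning becomes
        # a differentiable shrinkage operator rather than a hard mask.
        # Initialized at -4.0 so sigmoid(-4.0) ~ 0.02 and little is pruned
        # at the start of training.
        self.q_thresh = nn.Parameter(torch.full((bucket_size, dim), -4.0))
        self.r_thresh = nn.Parameter(torch.full((bucket_size, dim), -4.0))

    @staticmethod
    def _soft_threshold(w, s):
        # sign(w) * relu(|w| - sigmoid(s)): elements whose magnitude falls
        # below the learned (positive) threshold become exactly zero.
        return torch.sign(w) * torch.relu(torch.abs(w) - torch.sigmoid(s))

    def forward(self, entity_ids):
        q_idx = entity_ids // self.bucket_size  # quotient index
        r_idx = entity_ids % self.bucket_size   # remainder index
        q = self._soft_threshold(self.q_table(q_idx), self.q_thresh[q_idx])
        r = self._soft_threshold(self.r_table(r_idx), self.r_thresh[r_idx])
        # Element-wise sum composes the two pruned meta-embeddings; the
        # abstract does not pin down the exact composition operator.
        return q + r

# Usage: 1M entities served by two 1000-row tables (2 x 1000 x 128 weights
# plus thresholds, versus 1,000,000 x 128 for a full embedding table).
emb = CompositionalPrunedEmbedding(num_entities=1_000_000, bucket_size=1_000, dim=128)
vecs = emb(torch.tensor([0, 42, 999_999]))  # shape: (3, 128)

The complementarity regularizer the abstract mentions (encouraging the two pruned tables to retain mutually complementary information) is deliberately omitted here, since this record does not give its exact form.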
Pages: 378-387
Page count: 10
Related Papers
50 records in total
  • [1] Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning
    Lin, Shaohui
    Ji, Rongrong
    Li, Yuchao
    Deng, Cheng
    Li, Xuelong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (02) : 574 - 588
  • [2] Learning Compact Networks via Similarity-aware Channel Pruning
    Zhang, Quan
    Shi, Yemin
    Zhang, Lechun
    Wang, Yaowei
    Tian, Yonghong
    THIRD INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL (MIPR 2020), 2020, : 149 - 152
  • [3] Double Sparse Deep Reinforcement Learning via Multilayer Sparse Coding and Nonconvex Regularized Pruning
    Zhao, Haoli
    Wu, Jiqiang
    Li, Zhenni
    Chen, Wuhui
    Zheng, Zibin
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (02) : 765 - 778
  • [4] An Optimal Approach For Pruning Annular Regularized Extreme Learning Machines
    Singh, Lavneet
    Chetty, Girija
    2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOP (ICDMW), 2014, : 80 - 87
  • [5] Efficient itinerary recommendation via personalized POI selection and pruning
    Halder, Sajal
    Lim, Kwan Hui
    Chan, Jeffrey
    Zhang, Xiuzhen
    KNOWLEDGE AND INFORMATION SYSTEMS, 2022, 64 (04) : 963 - 993
  • [6] BinGAN: Learning Compact Binary Descriptors with a Regularized GAN
    Zieba, Maciej
    Semberecki, Piotr
    El-Gaaly, Tarek
    Trzcinski, Tomasz
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [7] Regularized Dynamic Boltzmann Machine with Delay Pruning for Unsupervised Learning of Temporal Sequences
    Dasgupta, Sakyasingha
    Yoshizumi, Takayuki
    Osogami, Takayuki
    2016 23RD INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2016, : 1201 - 1206
  • [8] CCPrune: Collaborative channel pruning for learning compact convolutional networks
    Chen, Yanming
    Wen, Xiang
    Zhang, Yiwen
    Shi, Weisong
    NEUROCOMPUTING, 2021, 451 : 35 - 45