HARSAM: A Hybrid Model for Recommendation Supported by Self-Attention Mechanism

Cited: 15
Authors
Peng, Dunlu [1 ]
Yuan, Weiwei [1 ]
Liu, Cong [1 ]
Affiliation
[1] Univ Shanghai Sci & Technol, Sch Opt Elect & Comp Engn, Shanghai 200093, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
SDAE; self-attention mechanism; preference expression; recommendation system;
DOI
10.1109/ACCESS.2019.2892565
Chinese Library Classification
TP [automation technology, computer technology]
Discipline Code
0812
Abstract
Collaborative filtering is one of the most commonly used methods in recommendation systems. However, the sparsity of the rating matrix, the cold-start problem, and the fact that most recommendation algorithms consider only users while neglecting the relationships among items all limit their effectiveness. In this paper, a deep learning model based on the self-attention mechanism, named HARSAM, is proposed for modeling user interaction data and learning the user's latent preference representation. HARSAM partitions the user's latent feedback data by time granularity and employs the self-attention mechanism to extract the correlations among the data in each partition. Moreover, the model learns the user's latent preferences through a deep neural network. Simultaneously, it learns latent item representations by applying a stacked denoising autoencoder (SDAE) to the items' rating data. As a result, the model recommends items to users according to the similarity between user preferences and item representations. Experiments conducted on public data demonstrate the effectiveness of the proposed model.
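The two core operations named in the abstract, self-attention over a partition of interaction embeddings and similarity-based ranking against item representations, can be illustrated with a minimal NumPy sketch. This is a generic scaled dot-product formulation and a cosine-similarity ranker, not the authors' exact HARSAM parameterization; the embedding dimensions and the `recommend` helper are assumptions for illustration.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over one time partition of
    interaction embeddings X (n x d): each row is re-expressed as a
    softmax-weighted mix of the rows it correlates with."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                # pairwise correlation scores
    scores -= scores.max(axis=1, keepdims=True)  # stabilize the softmax
    w = np.exp(scores)
    w /= w.sum(axis=1, keepdims=True)            # row-wise softmax weights
    return w @ X                                 # attended representations

def recommend(user_pref, item_reprs, k=3):
    """Rank items by cosine similarity between a user-preference vector
    and item representations (e.g. SDAE codes), returning top-k indices."""
    u = user_pref / np.linalg.norm(user_pref)
    V = item_reprs / np.linalg.norm(item_reprs, axis=1, keepdims=True)
    sims = V @ u
    return np.argsort(-sims)[:k]
```

In the full model these pieces sit at opposite ends of the pipeline: attention feeds a deep network that produces the user-preference vector, while the SDAE produces the item vectors that `recommend` ranks against it.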
Pages: 12620-12629
Page count: 10
Related Papers
50 records
  • [31] Time Interval Aware Self-Attention for Sequential Recommendation
    Li, Jiacheng
    Wang, Yujie
    McAuley, Julian
    PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 322 - 330
  • [32] Exception Handling Recommendation Based on Self-Attention Network
    Lin, Kai
    Tao, Chuanqi
    Huang, Zhiqiu
    2021 IEEE INTERNATIONAL SYMPOSIUM ON SOFTWARE RELIABILITY ENGINEERING WORKSHOPS (ISSREW 2021), 2021, : 282 - 283
  • [33] A time-aware self-attention based neural network model for sequential recommendation
    Zhang, Yihu
    Yang, Bo
    Liu, Haodong
    Li, Dongsheng
    APPLIED SOFT COMPUTING, 2023, 133
  • [34] Image classification model based on large kernel attention mechanism and relative position self-attention mechanism
    Liu, Siqi
    Wei, Jiangshu
    Liu, Gang
    Zhou, Bei
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [35] HTNet: A Hybrid Model Boosted by Triple Self-attention for Crowd Counting
    Li, Yang
    Yin, Baoqun
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT XII, 2024, 14436 : 290 - 301
  • [36] Self-Attention and Dynamic Convolution Hybrid Model for Neural Machine Translation
    Zhang, Zhebin
    Wu, Sai
    Chen, Gang
    Jiang, Dawei
    11TH IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH (ICKG 2020), 2020, : 352 - 359
  • [37] Multi-head Self-attention Recommendation Model based on Feature Interaction Enhancement
    Yin, Yunfei
    Huang, Caihao
    Sun, Jingqin
    Huang, Faliang
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 1740 - 1745
  • [38] A personalized paper recommendation method based on knowledge graph and transformer encoder with a self-attention mechanism
    Gao, Li
    Lan, Yu
    Yu, Zhen
    Zhu, Jian-min
    APPLIED INTELLIGENCE, 2023, 53 (24) : 29991 - 30008