HARSAM: A Hybrid Model for Recommendation Supported by Self-Attention Mechanism

Cited by: 15
Authors
Peng, Dunlu [1 ]
Yuan, Weiwei [1 ]
Liu, Cong [1 ]
Affiliation
[1] Univ Shanghai Sci & Technol, Sch Opt Elect & Comp Engn, Shanghai 200093, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
SDAE; self-attention mechanism; preference expression; recommendation system;
DOI
10.1109/ACCESS.2019.2892565
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Collaborative filtering is one of the most commonly used methods in recommendation systems. However, the sparsity of the rating matrix, the cold-start problem, and the fact that most recommendation algorithms consider only users while neglecting the relationships between products all limit the effectiveness of recommendation algorithms. In this paper, a deep learning model based on the self-attention mechanism, named HARSAM, is proposed for modeling user interaction data and learning the user's latent preference representation. HARSAM partitions the user's latent feedback data at different time granularities and employs the self-attention mechanism to extract the correlations among the data in each partition. Moreover, the model learns the user's latent preferences through a deep neural network. Simultaneously, it learns latent item representations by applying a stacked denoising autoencoder (SDAE) to the items' rating data. As a result, the model recommends items to users according to the similarity between the user's preferences and the items. Experiments conducted on public data demonstrate the effectiveness of the proposed model.
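The abstract describes two building blocks: self-attention applied to a partition of user-feedback embeddings, and recommendation by ranking items against a learned user preference vector. The following is a minimal NumPy sketch of those two steps, not the paper's implementation; the function names, the use of plain scaled dot-product attention (without learned query/key/value projections), and cosine similarity as the similarity measure are illustrative assumptions.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over one partition of
    user-feedback embeddings X with shape (n_items, d):
    each row becomes a correlation-weighted mixture of all rows."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                       # pairwise correlations
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)       # row-wise softmax
    return weights @ X

def recommend(user_pref, item_reprs, k=2):
    """Rank items by cosine similarity between the user's latent
    preference vector and each item's latent representation."""
    sims = item_reprs @ user_pref / (
        np.linalg.norm(item_reprs, axis=1) * np.linalg.norm(user_pref) + 1e-9)
    return np.argsort(-sims)[:k]                        # indices of top-k items
```

In the paper, the attention output feeds a deep neural network that produces `user_pref`, and `item_reprs` comes from the SDAE; here both are treated as given vectors to keep the sketch self-contained.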
Pages: 12620-12629
Page count: 10
Related Papers
50 records
  • [21] A prediction model of student performance based on self-attention mechanism
    Chen, Yan
    Wei, Ganglin
    Liu, Jiaxin
    Chen, Yunwei
    Zheng, Qinghua
    Tian, Feng
    Zhu, Haiping
    Wang, Qianying
    Wu, Yaqiang
    KNOWLEDGE AND INFORMATION SYSTEMS, 2023, 65 (02) : 733 - 758
  • [22] Dynamic Structured Neural Topic Model with Self-Attention Mechanism
    Miyamoto, Nozomu
    Isonuma, Masaru
    Takase, Sho
    Mori, Junichiro
    Sakata, Ichiro
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 5916 - 5930
  • [23] Generation Model of Character Posture Transfer Based on Self-attention Mechanism
    Zhao Ning
    Liu Libo
    LASER & OPTOELECTRONICS PROGRESS, 2022, 59 (04)
  • [24] A Text Sentiment Analysis Model Based on Self-Attention Mechanism
    Ji, Likun
    Gong, Ping
    Yao, Zhuyu
    2019 THE 3RD INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPILATION, COMPUTING AND COMMUNICATIONS (HP3C 2019), 2019, : 33 - 37
  • [25] A Bidirectional LSTM Spatiotemporal Interpolation Model with Self-attention Mechanism
    Zhou, Xiaoyu
    Wang, Haiqi
    Wang, Qiong
    Shan, Yufei
    Yan, Feng
    Li, Fadong
    Liu, Feng
    Cao, Yuanhao
    Ou, Yawen
    Li, Xueying
    JOURNAL OF GEO-INFORMATION SCIENCE, 2024, 26 (08) : 1827 - 1842
  • [26] Self-attention Based Collaborative Neural Network for Recommendation
    Ma, Shengchao
    Zhu, Jinghua
    WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2019, 2019, 11604 : 235 - 246
  • [27] Hashtag Recommendation Using LSTM Networks with Self-Attention
    Shen, Yatian
    Li, Yan
    Sun, Jun
    Ding, Wenke
    Shi, Xianjin
    Zhang, Lei
    Shen, Xiajiong
    He, Jing
    CMC-COMPUTERS MATERIALS & CONTINUA, 2019, 61 (03): : 1261 - 1269
  • [28] Core Interests Focused Self-attention for Sequential Recommendation
    Ai, Zhengyang
    Wang, Shupeng
    Jia, Siyu
    Guo, Shu
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2022, PT II, 2022, : 306 - 314
  • [29] Spiking neural self-attention network for sequence recommendation
    Bai, Xinzhu
    Huang, Yanping
    Peng, Hong
    Yang, Qian
    Wang, Jun
    Liu, Zhicai
    APPLIED SOFT COMPUTING, 2025, 169
  • [30] Weight Adjustment Framework for Self-Attention Sequential Recommendation
    Su, Zheng-Ang
    Zhang, Juan
    APPLIED SCIENCES-BASEL, 2024, 14 (09):