In-depth Recommendation Model Based on Self-Attention Factorization

Cited: 3
Authors
Ma, Hongshuang [1 ]
Liu, Qicheng [1 ]
Institutions
[1] Yantai Univ, Sch Comp & Control Engn, Yantai 264000, Shandong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Self-attention network; deep learning; recommendation model; review text
DOI
10.3837/tiis.2023.03.003
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline Code
0812
Abstract
Rating prediction is an important problem in recommender systems, and its accuracy affects both the user experience and the company's revenue. Traditional recommender systems use Factorization Machines for rating prediction, in which every feature is weighted equally, leading to inaccurate ratings and limited data representation. This study proposes a deep recommendation model based on self-attention factorization (SAFMR) to solve these problems. The model uses Convolutional Neural Networks to extract features from user and item reviews. The extracted features are fed into a Factorization Machine equipped with a self-attention network, which automatically learns the dependencies among the features and assigns them distinct weights, thereby reducing the prediction error. The model was evaluated experimentally on six classes of datasets, comparing MSE, NDCG, and running time across several real-world datasets. The experiments demonstrated that the SAFMR model achieves excellent rating prediction and recommendation relevance, verifying its effectiveness.
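To make the described architecture concrete, below is a minimal PyTorch sketch of the SAFMR idea as summarized in the abstract: a CNN extracts features from review-text embeddings, a self-attention layer learns dependencies among those features and re-weights them, and an FM-style interaction of the resulting user and item vectors yields the rating. All class names, layer sizes, and the pooling/interaction details here are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class SAFMRSketch(nn.Module):
    """Illustrative sketch (not the paper's exact code): CNN feature
    extraction from reviews + self-attention feature weighting + an
    FM-style user-item interaction for rating prediction."""

    def __init__(self, vocab_size, embed_dim=64, n_filters=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # CNN extracts local n-gram features from a review's word embeddings.
        self.conv = nn.Conv1d(embed_dim, n_filters, kernel_size=3, padding=1)
        # Self-attention learns dependencies among the extracted features
        # and assigns them different weights (the paper's key idea).
        self.attn = nn.MultiheadAttention(n_filters, num_heads=1, batch_first=True)
        self.out = nn.Linear(n_filters, 1)

    def encode(self, review_tokens):                    # (batch, seq_len)
        x = self.embed(review_tokens).transpose(1, 2)   # (batch, embed_dim, seq)
        h = torch.relu(self.conv(x)).transpose(1, 2)    # (batch, seq, n_filters)
        h, _ = self.attn(h, h, h)                       # re-weighted features
        return h.mean(dim=1)                            # (batch, n_filters)

    def forward(self, user_review, item_review):
        u, v = self.encode(user_review), self.encode(item_review)
        # Second-order (FM-style) interaction between user and item vectors.
        return self.out(u * v).squeeze(-1)              # predicted ratings

# Toy usage: 4 user/item pairs with 50-token reviews from a 10k-word vocabulary.
model = SAFMRSketch(vocab_size=10_000)
user = torch.randint(0, 10_000, (4, 50))
item = torch.randint(0, 10_000, (4, 50))
print(model(user, item).shape)  # torch.Size([4])
```

Note that a full Factorization Machine would model pairwise interactions over all feature fields; the element-wise product above is a simplification of that second-order term for brevity.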
Pages: 721 - 739
Number of pages: 19
Related Papers
50 records in total
  • [31] CRAM: Code Recommendation With Programming Context Based on Self-Attention Mechanism
    Tao, Chuanqi
    Lin, Kai
    Huang, Zhiqiu
    Sun, Xiaobing
    IEEE TRANSACTIONS ON RELIABILITY, 2023, 72 (01): 302 - 316
  • [32] Graph Contextualized Self-Attention Network for Session-based Recommendation
    Xu, Chengfeng
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Zhuang, Fuzhen
    Fang, Junhua
    Zhou, Xiaofang
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019: 3940 - 3946
  • [33] Design Resources Recommendation Based on Word Vectors and Self-Attention Mechanisms
    Sun Q.
    Deng C.
    Gu Z.
    Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2024, 36 (01): 63 - 72
  • [34] Probabilistic Matrix Factorization Recommendation of Self-Attention Mechanism Convolutional Neural Networks With Item Auxiliary Information
    Zhang, Chenkun
    Wang, Cheng
    IEEE ACCESS, 2020, 8 (08): 208311 - 208321
  • [35] Limits to Depth-Efficiencies of Self-Attention
    Levine, Yoav
    Wies, Noam
    Sharir, Or
    Bata, Hofit
    Shashua, Amnon
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [36] A self-attention model with contrastive learning for online group recommendation in event-based social networks
    Zhou, Zhiheng
    Huang, Xiaomei
    Xiong, Naixue
    Liao, Guoqiong
    Deng, Xiaobin
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (07): 9713 - 9741
  • [37] Hashtag Recommendation Using LSTM Networks with Self-Attention
    Shen, Yatian
    Li, Yan
    Sun, Jun
    Ding, Wenke
    Shi, Xianjin
    Zhang, Lei
    Shen, Xiajiong
    He, Jing
    CMC-COMPUTERS MATERIALS & CONTINUA, 2019, 61 (03): 1261 - 1269
  • [39] Core Interests Focused Self-attention for Sequential Recommendation
    Ai, Zhengyang
    Wang, Shupeng
    Jia, Siyu
    Guo, Shu
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2022, PT II, 2022: 306 - 314
  • [40] Spiking neural self-attention network for sequence recommendation
    Bai, Xinzhu
    Huang, Yanping
    Peng, Hong
    Yang, Qian
    Wang, Jun
    Liu, Zhicai
    APPLIED SOFT COMPUTING, 2025, 169