A Self-Attention Mask Learning-Based Recommendation System

Cited by: 5
Authors
Aljohani, Abeer [1 ]
Rakrouki, Mohamed Ali [1 ,2 ,3 ]
Alharbe, Nawaf [1 ]
Alluhaibi, Reyadh [4 ]
Affiliations
[1] Taibah Univ, Appl Coll, Madinah 1089, Saudi Arabia
[2] Univ Tunis, Ecole Super Sci Econ & Commerciales Tunis, Tunis 1938, Tunisia
[3] Univ Tunis, Tunis Business Sch, Business Analyt & Decis Making Lab BADEM, Tunis 1938, Tunisia
[4] Taibah Univ, Coll Comp Sci & Engn, Dept Comp Sci, Madinah 1089, Saudi Arabia
Keywords
Transformers; Biological system modeling; Behavioral sciences; Logic gates; Recommender systems; Encoding; Machine learning; Recommendation algorithm; machine learning; sequence recommendation model;
DOI
10.1109/ACCESS.2022.3202637
CLC Classification Number
TP [Automation Technology and Computer Technology];
Discipline Classification Code
0812;
Abstract
The primary purpose of sequence modeling is to capture long-term dependencies across interaction sequences, and because the number of items a user interacts with grows over time, this poses a challenge for sequence modeling. Relationships between items are often overlooked, so it is crucial to build sequential models that effectively capture long-term dependencies. Existing methods focus on extracting global sequential information while ignoring deep representations from subsequences. We argue that limited item transitions are fundamental to sequence modeling, and that partial substructures of a sequence can help a model learn long-term dependencies more efficiently than the entire sequence. This paper proposes a sequence recommendation model named GAT4Rec (Gated Recurrent Unit And Transformer For Recommendation), which uses a Transformer layer with parameters shared across layers to model the user's historical interaction sequence. The representation learned by a gated recurrent unit serves as a gating signal that filters out the more informative substructures of the user sequence. Experimental results demonstrate that GAT4Rec outperforms comparative models and achieves higher recommendation effectiveness.
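As a hedged illustration of the architecture the abstract describes (a parameter-shared Transformer encoder whose input is soft-filtered by a GRU-derived gating signal), the PyTorch sketch below is a minimal reconstruction, not the authors' implementation. All names and hyperparameters (GAT4RecSketch, hidden_size, num_blocks, the dot-product scoring head, etc.) are illustrative assumptions.

# Minimal sketch of the GAT4Rec idea described in the abstract; module and
# parameter names are assumptions, not the paper's official implementation.
import torch
import torch.nn as nn

class GAT4RecSketch(nn.Module):
    def __init__(self, num_items, hidden_size=64, num_heads=2, num_blocks=2, max_len=50):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, hidden_size, padding_idx=0)
        self.pos_emb = nn.Embedding(max_len, hidden_size)
        # GRU whose hidden states provide a gating signal over the sequence.
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.gate_proj = nn.Linear(hidden_size, hidden_size)
        # One Transformer encoder layer reused for every block = cross-layer parameter sharing.
        self.shared_block = nn.TransformerEncoderLayer(
            d_model=hidden_size, nhead=num_heads,
            dim_feedforward=4 * hidden_size, batch_first=True)
        self.num_blocks = num_blocks

    def forward(self, seq):                       # seq: (batch, seq_len) item ids, 0 = padding
        positions = torch.arange(seq.size(1), device=seq.device).unsqueeze(0)
        x = self.item_emb(seq) + self.pos_emb(positions)
        # GRU-derived sigmoid gate softly filters substructures of the interaction sequence.
        gru_out, _ = self.gru(x)
        gate = torch.sigmoid(self.gate_proj(gru_out))
        x = gate * x
        pad_mask = seq.eq(0)                      # True at padded positions
        for _ in range(self.num_blocks):          # same layer applied repeatedly (shared params)
            x = self.shared_block(x, src_key_padding_mask=pad_mask)
        # Score the next item by dot product with the item embedding table.
        return x[:, -1, :] @ self.item_emb.weight.t()   # (batch, num_items + 1)

# Toy usage: 4 users, sequences of 50 item ids, scores over the catalogue.
model = GAT4RecSketch(num_items=1000)
scores = model(torch.randint(1, 1001, (4, 50)))   # shape (4, 1001)

The dot-product scoring head and the soft (sigmoid) gating are common design choices in sequential recommenders and are used here only to make the sketch runnable end to end.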
Pages: 93017-93028
Number of pages: 12
Related Papers
50 records in total
  • [31] Graph contextualized self-attention network for session-based recommendation
    Xu, Chengfeng
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Zhuang, Fuzhen
    Fang, Junhua
    Zhou, Xiaofang
    IJCAI International Joint Conference on Artificial Intelligence, 2019, 2019-August : 3940 - 3946
  • [32] CRAM: Code Recommendation With Programming Context Based on Self-Attention Mechanism
    Tao, Chuanqi
    Lin, Kai
    Huang, Zhiqiu
    Sun, Xiaobing
    IEEE TRANSACTIONS ON RELIABILITY, 2023, 72 (01) : 302 - 316
  • [33] Graph Contextualized Self-Attention Network for Session-based Recommendation
    Xu, Chengfeng
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Zhuang, Fuzhen
    Fang, Junhua
    Zhou, Xiaofang
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 3940 - 3946
  • [34] Design Resources Recommendation Based on Word Vectors and Self-Attention Mechanisms
    Sun Q.
    Deng C.
    Gu Z.
    Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2024, 36 (01): : 63 - 72
  • [35] EXPLOITING OPTIMAL SELF-ATTENTION DEEP LEARNING-BASED RECOGNITION OF TEXTUAL EMOTIONS FOR DISABLED PERSONS
    Alshahrani, Haya Mesfer
    Yaseen, Ishfaq
    Drar, Suhanda
    FRACTALS-COMPLEX GEOMETRY PATTERNS AND SCALING IN NATURE AND SOCIETY, 2024, 32 (09N10)
  • [36] A self-attention model with contrastive learning for online group recommendation in event-based social networks
    Zhou, Zhiheng
    Huang, Xiaomei
    Xiong, Naixue
    Liao, Guoqiong
    Deng, Xiaobin
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (07): : 9713 - 9741
  • [37] A self-attention model with contrastive learning for online group recommendation in event-based social networks
    Zhiheng Zhou
    Xiaomei Huang
    Naixue Xiong
    Guoqiong Liao
    Xiaobin Deng
    The Journal of Supercomputing, 2024, 80 : 9713 - 9741
  • [38] Hashtag Recommendation Using LSTM Networks with Self-Attention
    Shen, Yatian
    Li, Yan
    Sun, Jun
    Ding, Wenke
    Shi, Xianjin
    Zhang, Lei
    Shen, Xiajiong
    He, Jing
    CMC-COMPUTERS MATERIALS & CONTINUA, 2019, 61 (03): : 1261 - 1269
  • [39] Core Interests Focused Self-attention for Sequential Recommendation
    Ai, Zhengyang
    Wang, Shupeng
    Jia, Siyu
    Guo, Shu
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2022, PT II, 2022, : 306 - 314
  • [40] Spiking neural self-attention network for sequence recommendation
    Bai, Xinzhu
    Huang, Yanping
    Peng, Hong
    Yang, Qian
    Wang, Jun
    Liu, Zhicai
    APPLIED SOFT COMPUTING, 2025, 169