Attributed network embedding based on self-attention mechanism for recommendation method

Cited by: 0
Authors
Wang, Shuo [1 ]
Yang, Jing [1 ]
Shang, Fanshu [1 ]
Affiliations
[1] Harbin Engn Univ, 145 Nangang Dist, Harbin 150000, Heilongjiang, Peoples R China
Keywords
DOI
10.1038/s41598-023-44696-1
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Science];
Discipline classification codes
07; 0710; 09;
Abstract
Network embedding is a technique for learning a low-dimensional vector representation for each node in a network. It has proven effective in network mining tasks, especially in recommendation systems. Real-world scenarios often contain rich attribute information that can be leveraged to enhance representation learning. This article therefore proposes an attributed network embedding recommendation method based on a self-attention mechanism (AESR), aimed at users with little or no explicit feedback data. AESR first models the attribute combination representation of items and then uses a self-attention mechanism to embed that combination compactly. By representing each user as a set of anchor vectors, the method can efficiently learn and reconstruct user preferences from few training samples, enabling accurate and fast recommendations while mitigating data sparsity. Experimental results show that AESR provides personalized recommendations even for users with little explicit feedback, and that extracting attributes from documents effectively improves recommendation accuracy on different datasets. Overall, AESR offers a promising approach to recommendation systems that leverage attribute information for better performance.
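To make the abstract's pipeline concrete, below is a minimal PyTorch sketch of the general idea, not the authors' implementation: item attribute embeddings are fused by a self-attention layer into a compact item vector, and each user is modeled by several anchor vectors whose best match against the item vector gives the preference score. The class name, the number of anchors, the mean pooling, and the dot-product scoring are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only (assumed design, not the published AESR code):
# self-attention over an item's attribute embeddings + per-user anchor vectors.
import torch
import torch.nn as nn


class AttributeSelfAttentionRecommender(nn.Module):
    def __init__(self, num_attr_values, embed_dim=64, num_heads=4,
                 num_users=1000, num_anchors=4):
        super().__init__()
        self.attr_embed = nn.Embedding(num_attr_values, embed_dim, padding_idx=0)
        self.self_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Each user keeps several anchor vectors to capture distinct preferences.
        self.user_anchors = nn.Embedding(num_users, num_anchors * embed_dim)
        self.num_anchors = num_anchors
        self.embed_dim = embed_dim

    def embed_item(self, attr_ids):
        # attr_ids: (batch, num_attrs) integer ids of an item's attributes (0 = padding).
        x = self.attr_embed(attr_ids)                       # (batch, num_attrs, dim)
        pad_mask = attr_ids.eq(0)                           # ignore padded attribute slots
        h, _ = self.self_attn(x, x, x, key_padding_mask=pad_mask)
        # Mean over the real (non-padded) attribute positions -> compact item vector.
        h = h.masked_fill(pad_mask.unsqueeze(-1), 0.0)
        return h.sum(1) / (~pad_mask).sum(1, keepdim=True).clamp(min=1)

    def score(self, user_ids, attr_ids):
        item_vec = self.embed_item(attr_ids)                # (batch, dim)
        anchors = self.user_anchors(user_ids).view(-1, self.num_anchors, self.embed_dim)
        # Score an item by its best-matching user anchor (max over anchors).
        return torch.einsum('bkd,bd->bk', anchors, item_vec).max(dim=1).values


if __name__ == "__main__":
    model = AttributeSelfAttentionRecommender(num_attr_values=500)
    users = torch.tensor([3, 7])
    items = torch.tensor([[12, 45, 0, 0], [9, 301, 77, 0]])  # padded attribute id lists
    print(model.score(users, items))                         # one preference score per pair
```

Under this reading, the anchor-based scoring is what lets a user with very little feedback be matched against attribute-level item structure rather than against sparse interaction history.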
Pages: 14
Related papers
50 records
  • [31] CGSNet: Contrastive Graph Self-Attention Network for Session-based Recommendation
    Wang, Fuyun
    Lu, Xuequan
    Lyu, Lei
    KNOWLEDGE-BASED SYSTEMS, 2022, 251
  • [32] Self-Attention Network for Session-Based Recommendation With Streaming Data Input
    Sun, Shiming
    Tang, Yuanhe
    Dai, Zemei
    Zhou, Fu
    IEEE ACCESS, 2019, 7 : 110499 - 110509
  • [33] Research on a Capsule Network Text Classification Method with a Self-Attention Mechanism
    Yu, Xiaodong
    Luo, Shun-Nain
    Wu, Yujia
    Cai, Zhufei
    Kuan, Ta-Wen
    Tseng, Shih-Pang
    SYMMETRY-BASEL, 2024, 16 (05):
  • [34] HARSAM: A Hybrid Model for Recommendation Supported by Self-Attention Mechanism
    Peng, Dunlu
    Yuan, Weiwei
    Liu, Cong
    IEEE ACCESS, 2019, 7 : 12620 - 12629
  • [35] SanMove: next location recommendation via self-attention network
    Wang, Bin
    Li, Huifeng
    Tong, Le
    Zhang, Qian
    Zhu, Sulei
    Yang, Tao
    DATA TECHNOLOGIES AND APPLICATIONS, 2023, 57 (03) : 330 - 343
  • [36] CSAN: Contextual Self-Attention Network for User Sequential Recommendation
    Huang, Xiaowen
    Qian, Shengsheng
    Fang, Quan
    Sang, Jitao
    Xu, Changsheng
    PROCEEDINGS OF THE 2018 ACM MULTIMEDIA CONFERENCE (MM'18), 2018, : 447 - 455
  • [37] Feature Interaction Dual Self-attention network for sequential recommendation
    Zhu, Yunfeng
    Yao, Shuchun
    Sun, Xun
    FRONTIERS IN NEUROROBOTICS, 2024, 18
  • [38] Crowd counting method based on the self-attention residual network
    Liu, Yan-Bo
    Jia, Rui-Sheng
    Liu, Qing-Ming
    Zhang, Xing-Li
    Sun, Hong-Mei
    APPLIED INTELLIGENCE, 2021, 51 (01) : 427 - 440
  • [39] Dynamic Network Embedding in Hyperbolic Space via Self-attention
    Duan, Dingyang
    Zha, Daren
    Yang, Xiao
    Mu, Nan
    Shen, Jiahui
    WEB ENGINEERING (ICWE 2022), 2022, 13362 : 189 - 203
  • [40] Modeling Periodic Pattern with Self-Attention Network for Sequential Recommendation
    Ma, Jun
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Zhao, Lei
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2020), PT III, 2020, 12114 : 557 - 572