MulAttenRec: A Multi-level Attention-Based Model for Recommendation

Cited by: 2
Authors
Lin, Zhipeng [1 ]
Yang, Wenjing [1 ]
Zhang, Yongjun [2 ]
Wang, Haotian [1 ]
Tang, Yuhua [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Comp, State Key Lab High Performance Comp, Changsha, Hunan, Peoples R China
[2] Natl Innovat Inst Def Technol, Beijing, Peoples R China
Funding
National Science Foundation (US);
Keywords
Recommender systems; Attention-based mechanism; Convolutional neural network; Factorization machine;
DOI
10.1007/978-3-030-04179-3_21
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
It is common nowadays for online buyers to rate shopping items and write review text. This review text has been proven to be very useful in understanding user preferences and item properties, and thus enhances the capability of Recommender Systems (RS). However, the usefulness of reviews and the significance of words within each review vary. In this paper, we introduce a multi-level attention mechanism to explore the usefulness of reviews and the significance of words, and propose a Multi-level Attention-based Model (MulAttenRec) for recommendation. In addition, we introduce a hybrid prediction layer that models the non-linear interactions between users and items by coupling a Factorization Machine (FM) with a Deep Neural Network (DNN), thereby capturing both low-order and high-order feature interactions. Extensive experiments show that our approach provides more accurate recommendations than state-of-the-art approaches including PMF, NMF, LDA, DeepCoNN, and NARRE. Furthermore, visualization and analysis of keywords and useful reviews validate the reasonableness of our multi-level attention mechanism.
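The following is a minimal sketch of the hybrid prediction layer the abstract describes (an FM term for low-order feature interactions plus a DNN for high-order ones), assuming a PyTorch implementation; the class name HybridFMDNN, layer sizes, and input layout are illustrative assumptions, not the authors' released code.

    # Minimal PyTorch sketch of a hybrid FM + DNN prediction layer.
    # Names and dimensions are illustrative assumptions.
    import torch
    import torch.nn as nn

    class HybridFMDNN(nn.Module):
        def __init__(self, input_dim: int, fm_rank: int = 8, hidden_dim: int = 64):
            super().__init__()
            # FM part: global bias, linear weights, and low-rank pairwise factors.
            self.bias = nn.Parameter(torch.zeros(1))
            self.linear = nn.Linear(input_dim, 1, bias=False)
            self.factors = nn.Parameter(torch.randn(input_dim, fm_rank) * 0.01)
            # DNN part: models higher-order, non-linear interactions.
            self.mlp = nn.Sequential(
                nn.Linear(input_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, 1),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, input_dim) concatenated user/item features, e.g.
            # attention-weighted review representations plus ID embeddings.
            # Second-order FM term: 0.5 * sum_k[(x V)^2 - (x^2)(V^2)].
            xv = x @ self.factors                      # (batch, fm_rank)
            x2v2 = (x ** 2) @ (self.factors ** 2)      # (batch, fm_rank)
            fm_second = 0.5 * (xv ** 2 - x2v2).sum(dim=1, keepdim=True)
            fm_out = self.bias + self.linear(x) + fm_second
            dnn_out = self.mlp(x)
            # The sum of both components gives the predicted rating.
            return (fm_out + dnn_out).squeeze(-1)

    # Usage: predict ratings for a batch of 4 user-item feature vectors.
    model = HybridFMDNN(input_dim=32)
    ratings = model(torch.randn(4, 32))   # shape: (4,)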
Pages: 240 - 252
Number of pages: 13
Related Papers
50 records
  • [21] Attention-based context-aware sequential recommendation model
    Yuan, Weihua
    Wang, Hong
    Yu, Xiaomei
    Liu, Nan
    Li, Zhenghao
    INFORMATION SCIENCES, 2020, 510 : 122 - 134
  • [22] A Multi-Level Attention Model for Evidence-Based Fact Checking
    Kruengkrai, Canasai
    Yamagishi, Junichi
    Wang, Xin
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 2447 - 2460
  • [23] Attention-Based Neural Tag Recommendation
    Yuan, Jiahao
    Jin, Yuanyuan
    Liu, Wenyan
    Wang, Xiaoling
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2019), PT II, 2019, 11447 : 350 - 365
  • [24] Attention-based multi attribute matrix factorization for enhanced recommendation performance
    Jang, Dongsoo
    Li, Qinglong
    Lee, Chaeyoung
    Kim, Jaekyeong
    INFORMATION SYSTEMS, 2024, 121
  • [25] Detecting Personal Medication Intake in Twitter via Domain Attention-Based RNN with Multi-Level Features
    Xiong, Shufeng
    Batra, Vishwash
    Liu, Liangliang
    Xi, Lei
    Sun, Changxia
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [26] CrePoster: Leveraging multi-level features for cultural relic poster generation via attention-based framework
    Zhang, Mohan
    Liu, Fang
    Li, Biyao
    Liu, Zhixiong
    Ma, Wentao
    Ran, Changjuan
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 245
  • [27] Next Point-of-Interest Recommendation with Temporal and Multi-level Context Attention
    Li, Ranzhen
    Shen, Yanyan
    Zhu, Yanmin
    2018 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2018, : 1110 - 1115
  • [28] An image inpainting model based on channel attention gated convolution and multi-level attention mechanism
    Zhao, Sihan
    Li, Chunmeng
    Zhang, Chenyang
    Yang, Xiaozhong
    DISPLAYS, 2025, 87
  • [29] AMNN: Attention-Based Multimodal Neural Network Model for Hashtag Recommendation
    Yang, Qi
    Wu, Gaosheng
    Li, Yuhua
    Li, Ruixuan
    Gu, Xiwu
    Deng, Huicai
    Wu, Junzhuang
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2020, 7 (03) : 768 - 779
  • [30] Road Crack Model Based on Multi-Level Feature Fusion and Attention Mechanism
    Song, Rongrong
    Wang, Caiyong
    Tian, Qichuan
    Zhang, Qi
COMPUTER ENGINEERING AND APPLICATIONS, 2023, 59 (13): 281 - 288