SanMove: next location recommendation via self-attention network

Cited by: 1
Authors
Wang, Bin [1 ]
Li, Huifeng [1 ]
Tong, Le [1 ]
Zhang, Qian [1 ]
Zhu, Sulei [1 ]
Yang, Tao [2 ]
Affiliations
[1] Shanghai Normal Univ, Shanghai, Peoples R China
[2] Shanghai Urban & Rural Construct & Traff Dev Res I, Shanghai, Peoples R China
Keywords
Next location prediction; Self-attention network; Auxiliary information
DOI
10.1108/DTA-03-2022-0093
CLC classification number
TP [Automation technology, computer technology]
Subject classification code
0812
Abstract
Purpose: This paper aims to address the following issues: (1) most existing methods are based on recurrent networks, which are slow to train on long sequences because they do not allow full parallelism; (2) personalized preferences are generally not modeled adequately; and (3) existing methods have rarely studied systematically how to efficiently exploit the various auxiliary information in trajectory data (e.g. user ID and timestamp) and the spatiotemporal relations among nonconsecutive locations.
Design/methodology/approach: The authors propose a novel self-attention network-based model named SanMove that predicts the next location by capturing users' long- and short-term mobility patterns. Specifically, SanMove uses a self-attention module to capture each user's long-term preference, which represents her personalized location preference. Meanwhile, a spatial-temporal guided noninvasive self-attention (STNOVA) module exploits auxiliary information in the trajectory data to learn the user's short-term preference.
Findings: The authors evaluate SanMove on two real-world datasets. The experimental results demonstrate that SanMove is not only faster to train than state-of-the-art recurrent neural network (RNN)-based prediction models but also outperforms the baselines on next location prediction.
Originality/value: The authors propose a self-attention-based sequential model named SanMove, comprising long-term and short-term preference learning modules, to predict a user's trajectory. SanMove allows fully parallel processing of trajectories, which improves processing efficiency. An STNOVA module captures the sequential transitions within the current trajectory, while a self-attention module processes historical trajectory sequences to capture each user's personalized location preference. Extensive experiments on two check-in datasets demonstrate that the model trains quickly and performs well compared with existing RNN-based methods for next location prediction.
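The abstract describes SanMove's two-branch design only at a high level (self-attention over historical trajectories for long-term preference, plus a spatial-temporal guided noninvasive self-attention module over the current trajectory for short-term preference). The sketch below is a minimal, hypothetical PyTorch rendering of that idea, not the authors' implementation: the class NextLocationSAN, all hyperparameters, and the choice to add user/time embeddings only to the attention queries and keys (leaving location values untouched) are assumptions made for illustration and only approximate the STNOVA module described above.

```python
# Illustrative sketch only (not the paper's code): a next-location recommender
# combining a long-term branch (self-attention over the historical check-in
# sequence) with a short-term branch in which auxiliary information (user ID,
# time slot) guides attention "non-invasively", i.e. it shifts queries/keys
# while the location value embeddings are left unchanged.
import torch
import torch.nn as nn


class NextLocationSAN(nn.Module):
    def __init__(self, n_locations, n_users, n_time_slots, d_model=64, n_heads=4):
        super().__init__()
        self.loc_emb = nn.Embedding(n_locations, d_model, padding_idx=0)
        self.user_emb = nn.Embedding(n_users, d_model)
        self.time_emb = nn.Embedding(n_time_slots, d_model)
        # Long-term branch: plain self-attention over the historical trajectory.
        self.long_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Short-term branch: attention over the current trajectory, guided by
        # auxiliary context -- a crude stand-in for the paper's STNOVA module.
        self.short_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.out = nn.Linear(2 * d_model, n_locations)

    def forward(self, hist_locs, cur_locs, cur_times, user_ids):
        # hist_locs: (B, Lh)   cur_locs, cur_times: (B, Lc)   user_ids: (B,)
        h = self.loc_emb(hist_locs)
        long_pref, _ = self.long_attn(h, h, h)      # (B, Lh, d)
        long_pref = long_pref.mean(dim=1)           # pool history -> (B, d)

        c = self.loc_emb(cur_locs)
        aux = self.time_emb(cur_times) + self.user_emb(user_ids).unsqueeze(1)
        q = c + aux                                 # auxiliary info guides queries/keys only
        k = c + aux
        short_pref, _ = self.short_attn(q, k, c)    # values stay "non-invaded"
        short_pref = short_pref[:, -1, :]           # last step -> (B, d)

        # Fuse long- and short-term preferences into logits over all locations.
        return self.out(torch.cat([long_pref, short_pref], dim=-1))
```

In practice one would also need padding masks, a causal mask on the current trajectory, and the spatial-temporal relation matrices the paper uses; the point of the sketch is only how the long- and short-term branches can be fused into a single logit vector over candidate next locations.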
Pages: 330-343
Number of pages: 14
Related papers
50 records in total
  • [31] Individualized tourism recommendation based on self-attention
    Liu, Guangjie
    Ma, Xin
    Zhu, Jinlong
    Zhang, Yu
    Yang, Danyang
    Wang, Jianfeng
    Wang, Yi
    PLOS ONE, 2022, 17 (08):
  • [32] AUBRec: adaptive augmented self-attention via user behaviors for sequential recommendation
    Fan, Jin
    Yu, Xiaofeng
    Wang, Zehao
    Wang, Weijie
    Sun, Danfeng
    Wu, Huifeng
NEURAL COMPUTING & APPLICATIONS, 2022, 34 (24): 21715-21728
  • [33] Sequential Recommendation via Temporal Self-Attention and Multi-Preference Learning
    Wang, Wenchao
    Zhu, Jinghua
    Xi, Heran
WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2021, PT II, 2021, 12938: 18-30
  • [34] CGSNet: Contrastive Graph Self-Attention Network for Session-based Recommendation
    Wang, Fuyun
    Lu, Xuequan
    Lyu, Lei
    KNOWLEDGE-BASED SYSTEMS, 2022, 251
  • [36] A Dual-View Knowledge Enhancing Self-Attention Network for Sequential Recommendation
    Tang, Hao
    Zhang, Feng
    Xu, Xinhai
    Zhang, Jieyuan
    Liu, Donghong
2022 IEEE 34TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, ICTAI, 2022: 832-839
  • [37] Long- and short-term self-attention network for sequential recommendation
    Xu, Chengfeng
    Feng, Jian
    Zhao, Pengpeng
    Zhuang, Fuzhen
    Wang, Deqing
    Liu, Yanchi
    Sheng, Victor S.
NEUROCOMPUTING, 2021, 423: 580-589
  • [38] Self-Attention Network for Session-Based Recommendation With Streaming Data Input
    Sun, Shiming
    Tang, Yuanhe
    Dai, Zemei
    Zhou, Fu
IEEE ACCESS, 2019, 7: 110499-110509
  • [39] Attributed Heterogeneous Information Network Embedding with Self-Attention Mechanism for Product Recommendation
    Wang H.
    Yang D.
    Nie T.
    Kou Y.
Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2022, 59 (07): 1509-1521
  • [40] The function of the self-attention network
    Cunningham, Sheila J.
COGNITIVE NEUROSCIENCE, 2016, 7 (1-4): 21-22