A Transformer-Based Substitute Recommendation Model Incorporating Weakly Supervised Customer Behavior Data

Citations: 0
Authors
Ye, Wenting [1 ]
Yang, Hongfei [1 ]
Zhao, Shuai [1 ]
Fang, Haoyang [1 ]
Shi, Xingjian [2 ]
Neppalli, Naveen [1 ]
Affiliations
[1] Amazon Retails, Seattle, WA 98109 USA
[2] AWS AI, Santa Clara, CA USA
Keywords
substitute recommendation; multilingual; weakly supervised learning; natural language processing; selection bias; implicit feedback;
DOI
10.1145/3539618.3591847
CLC Classification Number
TP [Automation technology, computer technology];
Discipline Code
0812;
Abstract
Substitute-based recommendation is widely used in e-commerce to offer customers better alternatives. However, existing research typically relies on customer behavior signals such as co-view and view-but-purchase-another to capture the substitute relationship. Despite its intuitive soundness, such an approach may ignore the functionality and characteristics of products. In this paper, we recast substitute recommendation as a language matching problem that takes product title descriptions as model input, so that product functionality is taken into account. We design a new transformation method to de-noise the signals derived from production data, and we address multilingual support from an engineering point of view. Our proposed end-to-end transformer-based model succeeds in both offline and online experiments. It has been deployed on a large-scale e-commerce website serving 11 marketplaces in 6 languages, and an online A/B experiment shows a 19% increase in revenue.
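To make the title-matching idea in the abstract concrete, the sketch below scores candidate substitutes by encoding product titles with a multilingual transformer and ranking them by embedding similarity. This is only an illustrative assumption of the general setup: the checkpoint name, mean pooling, cosine scoring, and the helper embed_titles are hypothetical choices, not the paper's actual architecture or its weakly supervised training procedure.

```python
# Minimal sketch, assuming a bi-encoder over product titles with a
# multilingual transformer backbone; all names below are illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"  # assumed multilingual backbone
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
encoder.eval()

def embed_titles(titles):
    """Mean-pool token embeddings of product titles into unit-norm vectors."""
    batch = tokenizer(titles, padding=True, truncation=True,
                      max_length=64, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state           # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()      # (B, T, 1)
    pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1)     # (B, H)
    return torch.nn.functional.normalize(pooled, dim=-1)

query_title = "Stainless steel insulated water bottle, 32 oz"
candidate_titles = [
    "Vacuum insulated stainless steel flask, 1 liter",
    "Plastic cutting board set, 3 pieces",
]
scores = embed_titles([query_title]) @ embed_titles(candidate_titles).T
ranking = scores.squeeze(0).argsort(descending=True).tolist()
print([candidate_titles[i] for i in ranking])
```

In the paper's setting, such an encoder would be fine-tuned end to end on de-noised, weakly supervised behavior signals rather than used off the shelf as shown here.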
Pages: 3325-3329
Number of pages: 5
Related Papers
50 records in total
  • [21] A Transformer-based self-supervised learning model for fault diagnosis of air-conditioning systems with limited labeled data
    Hua, Mei
    Yan, Ke
    Li, Xin
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 146
  • [22] Designing a Movie Recommendation System Through a Transformer-Based Embeddings Space
    Iglesias, Oscar I. R.
    Pardo, Carlos E. B.
    Lopez, Jose Onate
    Quintero, Christian G. M.
    2024 IEEE COLOMBIAN CONFERENCE ON COMMUNICATIONS AND COMPUTING, COLCOM 2024, 2024,
  • [23] Transformer-Based Self-Supervised Monocular Depth and Visual Odometry
    Zhao, Hongru
    Qiao, Xiuquan
    Ma, Yi
    Tafazolli, Rahim
    IEEE SENSORS JOURNAL, 2023, 23 (02) : 1436 - 1446
  • [24] Multimodal Emotion Recognition With Transformer-Based Self Supervised Feature Fusion
    Siriwardhana, Shamane
    Kaluarachchi, Tharindu
    Billinghurst, Mark
    Nanayakkara, Suranga
IEEE ACCESS, 2020, 8 (08) : 176274 - 176285
  • [25] Empowering Retail Dual Transformer-Based Profound Product Recommendation Using Multi-Model Review
    Alsekait, Deema Mohammed
    Nawaz, Asif
    Fathi, Hanaa
    Ahmed, Zohair
    Taha, Mohamed
    Alshinwan, Mohammad
    Taha, Ahmed
    Issa, Mohamed F.
    Nabil, Ayman
    AbdElminaam, Diaa Salama
    JOURNAL OF ORGANIZATIONAL AND END USER COMPUTING, 2024, 36 (01)
  • [26] Personality BERT: A Transformer-Based Model for Personality Detection from Textual Data
    Jain, Dipika
    Kumar, Akshi
    Beniwal, Rohit
    PROCEEDINGS OF INTERNATIONAL CONFERENCE ON COMPUTING AND COMMUNICATION NETWORKS (ICCCN 2021), 2022, 394 : 515 - 522
  • [27] Vision Transformer-Based Photovoltaic Prediction Model
    Kang, Zaohui
    Xue, Jizhong
    Lai, Chun Sing
    Wang, Yu
    Yuan, Haoliang
    Xu, Fangyuan
    ENERGIES, 2023, 16 (12)
  • [28] Transformer-Based Model for Electrical Load Forecasting
    L'Heureux, Alexandra
    Grolinger, Katarina
    Capretz, Miriam A. M.
    ENERGIES, 2022, 15 (14)
  • [29] Transformer-Based Model for Auditory EEG Decoding
    Chen, Jiaxin
    Liu, Yin-Long
    Feng, Rui
    Yuan, Jiahong
    Ling, Zhen-Hua
    MAN-MACHINE SPEECH COMMUNICATION, NCMMSC 2024, 2025, 2312 : 129 - 143
  • [30] A superior image inpainting scheme using Transformer-based self-supervised attention GAN model
    Zhou, Meili
    Liu, Xiangzhen
    Yi, Tingting
    Bai, Zongwen
    Zhang, Pei
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 233