Empowering Retail Dual Transformer-Based Profound Product Recommendation Using Multi-Model Review

Cited by: 0
Authors
Alsekait, Deema Mohammed [1 ]
Nawaz, Asif [2 ]
Fathi, Hanaa [3 ]
Ahmed, Zohair [4 ]
Taha, Mohamed [5 ]
Alshinwan, Mohammad [6 ]
Taha, Ahmed [5 ]
Issa, Mohamed F.
Nabil, Ayman [7 ]
AbdElminaam, Diaa Salama [8 ]
Affiliations
[1] Princess Nourah bint Abdulrahman Univ, Riyadh, Saudi Arabia
[2] PMAS Arid Agr Univ, Rawalpindi, Pakistan
[3] Appl Sci Private Univ, Amman, Jordan
[4] Islamic Univ, Islamabad, Pakistan
[5] Benha Univ, Banha, Egypt
[6] Univ Pannonia, Pannonia, Hungary
[7] Misr Int Univ, Cairo, Egypt
[8] Jadara Res Ctr, Cairo, Egypt
Keywords
Multus-Medium Reviews; Recommendation; Sentiment Score; SpanBERT; Fusion; Vti Transformer; SENTIMENT ANALYSIS; MODEL;
DOI
10.4018/JOEUC.358002
Chinese Library Classification
TP [Automation technology, computer technology];
Discipline Code
0812;
Abstract
Advancements in technology have significantly changed how we interact on social media platforms, where reviews and comments heavily influence consumer decisions. Traditionally, opinion mining has focused on textual data, overlooking the valuable insights present in customer-uploaded images, a concept we term Multus-Medium. This paper introduces a multimodal strategy for product recommendations that utilizes both text and image data. The proposed approach involves data collection, preprocessing, and sentiment analysis using Vti for images and SpanBERT for text reviews. These outputs are then fused to generate a final recommendation. The proposed model demonstrates superior performance, achieving 91.55% accuracy on the Amazon dataset and 90.89% on the Kaggle dataset. These compelling findings underscore the potential of our approach, offering a comprehensive and precise method for opinion mining in the era of social media-driven product reviews, ultimately aiding consumers in making informed purchasing decisions.
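The abstract outlines a dual-transformer, late-fusion pipeline: SpanBERT encodes the review text, a vision transformer (the "Vti" of the keywords) encodes the customer-uploaded image, and the two outputs are fused into a final recommendation. The sketch below illustrates one plausible way to wire such a model in PyTorch with Hugging Face checkpoints; the checkpoint names (SpanBERT/spanbert-base-cased, google/vit-base-patch16-224-in21k), the concatenation fusion, and the two-class head are illustrative assumptions, not the authors' published implementation.

    # Hypothetical sketch of a dual-transformer late-fusion recommender.
    # Checkpoints, concatenation fusion, and the 2-class head are assumptions
    # for illustration; this is not the paper's released code.
    import torch
    import torch.nn as nn
    from PIL import Image
    from transformers import AutoModel, AutoTokenizer, ViTImageProcessor, ViTModel

    class DualTransformerRecommender(nn.Module):
        def __init__(self,
                     text_ckpt="SpanBERT/spanbert-base-cased",        # text branch (SpanBERT)
                     image_ckpt="google/vit-base-patch16-224-in21k"): # image branch (ViT)
            super().__init__()
            self.text_encoder = AutoModel.from_pretrained(text_ckpt)
            self.image_encoder = ViTModel.from_pretrained(image_ckpt)
            fused_dim = (self.text_encoder.config.hidden_size
                         + self.image_encoder.config.hidden_size)
            # Simple late fusion: concatenate [CLS] embeddings, then classify.
            self.head = nn.Sequential(
                nn.Linear(fused_dim, 256),
                nn.ReLU(),
                nn.Dropout(0.1),
                nn.Linear(256, 2),  # logits over {do not recommend, recommend}
            )

        def forward(self, text_inputs, image_inputs):
            text_vec = self.text_encoder(**text_inputs).last_hidden_state[:, 0]    # text [CLS]
            image_vec = self.image_encoder(**image_inputs).last_hidden_state[:, 0] # ViT [CLS]
            return self.head(torch.cat([text_vec, image_vec], dim=-1))

    # Minimal usage example with a blank placeholder image.
    tokenizer = AutoTokenizer.from_pretrained("SpanBERT/spanbert-base-cased")
    processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")
    model = DualTransformerRecommender().eval()

    text = tokenizer(["Great fit, but the fabric looks cheap"],
                     return_tensors="pt", truncation=True, padding=True)
    image = processor(images=Image.new("RGB", (224, 224)), return_tensors="pt")
    with torch.no_grad():
        probs = model(text, image).softmax(dim=-1)

A score-level variant (fusing per-branch sentiment probabilities rather than embeddings) would match the abstract's wording equally well; the embedding-level concatenation shown here is simply the more common baseline for this kind of dual-encoder setup.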
Pages: 23
Related Papers
50 records in total
  • [21] Multi-step-ahead and interval carbon price forecasting using transformer-based hybrid model
    Wang, Yue
    Wang, Zhong
    Wang, Xiaoyi
    Kang, Xinyu
    ENVIRONMENTAL SCIENCE AND POLLUTION RESEARCH, 2023, 30 (42) : 95692 - 95719
  • [22] Fake review detection using transformer-based enhanced LSTM and RoBERTa
    Mohawesh, R.
    Bany Salameh, H.
    Jararweh, Y.
    Alkhalaileh, M.
    Maqsood, S.
    International Journal of Cognitive Computing in Engineering, 2024, 5 : 250 - 258
  • [23] TransMF: Transformer-Based Multi-Scale Fusion Model for Crack Detection
    Ju, Xiaochen
    Zhao, Xinxin
    Qian, Shengsheng
    MATHEMATICS, 2022, 10 (13)
  • [24] Multi-Modal Pedestrian Crossing Intention Prediction with Transformer-Based Model
    Wang, Ting-Wei
    Lai, Shang-Hong
    APSIPA TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING, 2024, 13 (05)
  • [25] Pedestrian Crossing Intention Prediction with Multi-Modal Transformer-Based Model
    Wang, Ting Wei
    Lai, Shang-Hong
    2023 ASIA PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE, APSIPA ASC, 2023, : 1349 - 1356
  • [26] Transformer-Based Intelligent Prediction Model for Multimodal Multi-Objective Optimization
    Dang, Qianlong
    Zhang, Guanghui
    Wang, Ling
    Yu, Yang
    Yang, Shuai
    He, Xiaoyu
    IEEE COMPUTATIONAL INTELLIGENCE MAGAZINE, 2025, 20 (01) : 34 - 49
  • [27] Transformer Based Multi-model Fusion for 3D Facial Animation
    Chen, Benwang
    Luo, Chunshui
    Wang, Haoqian
    2023 2ND CONFERENCE ON FULLY ACTUATED SYSTEM THEORY AND APPLICATIONS, CFASTA, 2023, : 659 - 663
  • [28] A multi-task dual attention deep recommendation model using ratings and review helpfulness
    Liu, Zhen
    Yuan, Baoxin
    Ma, Ying
    Applied Intelligence, 2022, 52 : 5595 - 5607
  • [29] A multi-task dual attention deep recommendation model using ratings and review helpfulness
    Liu, Zhen
    Yuan, Baoxin
    Ma, Ying
    APPLIED INTELLIGENCE, 2022, 52 (05) : 5595 - 5607
  • [30] Object detection using convolutional neural networks and transformer-based models: a review
    Shah, Shrishti
    Tembhurne, Jitendra
    Journal of Electrical Systems and Information Technology, 10 (1)