Multi-Source Domain Adaptation Enhanced Warehouse Dwell Time Prediction

Cited: 0
Authors
Zhao, Wei [1 ]
Mao, Jiali [1 ]
Lv, Xingyi [1 ]
Jin, Cheqing [1 ]
Zhou, Aoying [1 ]
Affiliations
[1] East China Normal Univ, Sch Data Sci & Engn, Shanghai 200062, Peoples R China
Keywords
Attention; bulk logistics; queuing system; transfer learning; queue
DOI
10.1109/TKDE.2023.3324656
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Warehouse dwell time (WDT) of a truck is a critical metric for evaluating plant-logistics efficiency; it covers both the time the truck spends queuing outside the warehouse and the time spent loading inside it. WDT prediction is challenging because it is affected by diverse factors, such as the distinct types and weights of the cargoes being loaded and the varying numbers of loading tasks in different time slots. Moreover, each truck's WDT is transitively influenced by the loading times of the preceding trucks in the queue. In this paper, we propose a multi-block dwell time prediction framework, called SDP, that combines an LSTM model with a self-attention mechanism. Because sparse loading data at some warehouses degrades SDP's performance, we further design a multi-source-adaptation-based block-to-block transfer learning module. We present a warehouse similarity measure based on the loading tasks allocated to each warehouse and its loading ability, and use it to improve overall prediction performance by learning from high-performing WDT prediction models of similar warehouses. Experimental results on a large-scale logistics data set demonstrate that our proposal reduces Mean Absolute Percentage Error (MAPE) by an average of 10.0%, Mean Absolute Error (MAE) by an average of 16.5%, and Root Mean Square Error (RMSE) by an average of 17.0% compared to the baselines.
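To make the described architecture concrete, the following is a minimal, hypothetical PyTorch sketch of an SDP-style predictor that feeds per-truck loading features for the queue through an LSTM and applies self-attention over the queue positions before regressing the dwell time. The module names, feature dimensions, and regression head are assumptions made for illustration only and do not reflect the authors' implementation.

```python
# Hypothetical sketch of an SDP-style dwell time predictor (not the authors' code).
# Assumption: each sample is a sequence of feature vectors for the trucks queued
# ahead of (and including) the target truck, e.g. cargo type/weight, task counts.
import torch
import torch.nn as nn


class SDPSketch(nn.Module):
    def __init__(self, feat_dim=16, hidden_dim=64, num_heads=4):
        super().__init__()
        # LSTM captures the transitive influence of preceding trucks in the queue.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        # Self-attention re-weights queue positions by relevance to the target truck.
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        # Regression head outputs the predicted warehouse dwell time (e.g., minutes).
        self.head = nn.Sequential(nn.Linear(hidden_dim, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, queue_feats):
        # queue_feats: (batch, queue_len, feat_dim)
        h, _ = self.lstm(queue_feats)
        a, _ = self.attn(h, h, h)
        # Use the representation of the last position (the target truck).
        return self.head(a[:, -1, :]).squeeze(-1)


if __name__ == "__main__":
    model = SDPSketch()
    x = torch.randn(8, 10, 16)   # 8 samples, queues of 10 trucks, 16 features each
    print(model(x).shape)        # torch.Size([8])
```

In the paper's setting, such a base predictor would be trained per warehouse block; warehouses with sparse loading data would then borrow from the trained models of similar warehouses via the proposed block-to-block transfer module.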
Pages: 2533-2547
Page count: 15