Task Migration Based on Reinforcement Learning in Vehicular Edge Computing

Cited by: 7
Authors
Moon, Sungwon [1]
Park, Jaesung [2]
Lim, Yujin [1]
Affiliations
[1] Sookmyung Womens Univ, Dept IT Engn, Seoul 04310, South Korea
[2] Kwangwoon Univ, Sch Informat Convergence, Seoul 01897, South Korea
Funding
National Research Foundation of Singapore
Keywords
SERVICE MIGRATION; OPTIMIZATION; MEC;
DOI
10.1155/2021/9929318
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Multiaccess edge computing (MEC) has emerged as a promising technology for time-sensitive and computation-intensive tasks. Because users are highly mobile, especially in vehicular environments, migrating computational tasks between vehicular edge computing servers (VECSs) has become one of the most critical challenges in guaranteeing quality-of-service (QoS) requirements. If vehicles' tasks migrate unevenly to particular VECSs, performance can degrade in terms of latency and QoS. Therefore, in this study, we define a computational task migration problem that balances the loads of VECSs while minimizing migration costs. To solve this problem, we adopt a reinforcement learning algorithm in a cooperative VECS group environment in which the VECSs of a group can collaborate with one another. The objective of this study is to optimize load balancing and migration cost while satisfying the delay constraints of the vehicles' computation tasks. Simulations are performed to evaluate the performance of the proposed algorithm. The results show that, compared with other algorithms, the proposed algorithm achieves approximately 20-40% better load balancing and an approximately 13-28% higher task completion rate within the delay constraints.
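The abstract describes a reinforcement learning agent that chooses where to migrate a vehicle's task within a cooperative VECS group so that server loads stay balanced and migration cost stays low. The Python sketch below is a minimal, hypothetical illustration of that kind of formulation using tabular Q-learning; it is not the authors' implementation. The state encoding (coarse per-VECS load levels), the action space (index of the destination VECS), the reward weights, and all names such as encode_state and choose_target are assumptions made for illustration only.

# Minimal tabular Q-learning sketch for VECS task migration (illustrative only).
# All constants, state/action/reward choices, and names are assumptions,
# not the formulation used in the paper.

import random
from collections import defaultdict

NUM_VECS = 4          # size of the cooperative VECS group (assumed)
LOAD_LEVELS = 3       # coarse load quantization: low / medium / high (assumed)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1
W_BALANCE, W_MIGRATION = 1.0, 0.5   # reward weights (assumed)

Q = defaultdict(float)              # Q[(state, action)] -> estimated value

def encode_state(loads):
    """Quantize per-VECS utilizations (0..1) into a discrete state tuple."""
    return tuple(min(int(l * LOAD_LEVELS), LOAD_LEVELS - 1) for l in loads)

def reward(loads, src, dst):
    """Higher reward for balanced loads and for cheap (nearby) migrations."""
    mean = sum(loads) / len(loads)
    imbalance = sum((l - mean) ** 2 for l in loads) / len(loads)
    migration_cost = abs(src - dst)          # e.g., hop distance between VECSs
    return -(W_BALANCE * imbalance + W_MIGRATION * migration_cost)

def choose_target(state):
    """Epsilon-greedy selection of the destination VECS."""
    if random.random() < EPS:
        return random.randrange(NUM_VECS)
    return max(range(NUM_VECS), key=lambda a: Q[(state, a)])

def step(loads, src, task_load=0.1):
    """Migrate one task away from VECS `src`, update Q, and return new loads."""
    state = encode_state(loads)
    dst = choose_target(state)
    new_loads = list(loads)
    new_loads[src] = max(0.0, new_loads[src] - task_load)
    new_loads[dst] = min(1.0, new_loads[dst] + task_load)
    r = reward(new_loads, src, dst)
    next_state = encode_state(new_loads)
    best_next = max(Q[(next_state, a)] for a in range(NUM_VECS))
    Q[(state, dst)] += ALPHA * (r + GAMMA * best_next - Q[(state, dst)])
    return new_loads

if __name__ == "__main__":
    loads = [0.9, 0.2, 0.4, 0.1]    # initial per-VECS utilization (assumed)
    for _ in range(1000):
        # Always migrate from the currently most-loaded VECS in this toy run.
        loads = step(loads, src=max(range(NUM_VECS), key=lambda i: loads[i]))
    print("final loads:", [round(l, 2) for l in loads])

In this toy setup, the penalty on load variance pushes the agent toward under-utilized servers, while the distance term discourages costly long-range migrations, mirroring the trade-off the abstract describes between load balancing and migration cost.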
Pages: 10
Related Papers
50 records in total
  • [1] Adaptive Task Offloading in Vehicular Edge Computing Networks: a Reinforcement Learning Based Scheme
    Zhang, Jie
    Guo, Hongzhi
    Liu, Jiajia
    MOBILE NETWORKS & APPLICATIONS, 2020, 25 (05): 1736 - 1745
  • [2] Trusted Task Offloading in Vehicular Edge Computing Networks: A Reinforcement Learning Based Solution
    Zhang, Lushi
    Guo, Hongzhi
    Zhou, Xiaoyi
    Liu, Jiajia
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 6711 - 6716
  • [3] Joint Task Offloading and Service Migration in RIS assisted Vehicular Edge Computing Network Based on Deep Reinforcement Learning
    Ning, Xiangrui
    Zeng, Ming
    Fei, Zesong
    2024 INTERNATIONAL CONFERENCE ON COMPUTING, NETWORKING AND COMMUNICATIONS, ICNC, 2024, : 1037 - 1042
  • [4] Deep Reinforcement Learning-Based Task Offloading and Load Balancing for Vehicular Edge Computing
    Wu, Zhoupeng
    Jia, Zongpu
    Pang, Xiaoyan
    Zhao, Shan
    ELECTRONICS, 2024, 13 (08)
  • [5] Federated Deep Reinforcement Learning Based Task Offloading with Power Control in Vehicular Edge Computing
    Moon, Sungwon
    Lim, Yujin
    SENSORS, 2022, 22 (24)
  • [6] Dynamic Vehicle Aware Task Offloading Based on Reinforcement Learning in a Vehicular Edge Computing Network
    Wang, Lingling
    Zhu, Xiumin
    Li, Nianxin
    Li, Yumei
    Ma, Shuyue
    Zhai, Linbo
    2022 18TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING, MSN, 2022, : 263 - 270
  • [7] Prioritized Task Offloading in Vehicular Edge Computing Using Deep Reinforcement Learning
    Uddin, Ashab
    Sakr, Ahmed Hamdi
    Zhang, Ning
    2024 IEEE 99TH VEHICULAR TECHNOLOGY CONFERENCE, VTC2024-SPRING, 2024,
  • [8] Task offloading in vehicular edge computing networks via deep reinforcement learning
    Karimi, Elham
    Chen, Yuanzhu
    Akbari, Behzad
    COMPUTER COMMUNICATIONS, 2022, 189 : 193 - 204
  • [9] Meta Reinforcement Learning for Multi-Task Offloading in Vehicular Edge Computing
    Dai, Penglin
    Huang, Yaorong
    Hu, Kaiwen
    Wu, Xiao
    Xing, Huanlai
    Yu, Zhaofei
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (03) : 2123 - 2138