Predictive Energy-Aware Adaptive Sampling with Deep Reinforcement Learning

Cited by: 2
Authors
Heo, Seonyeong [1 ]
Mayer, Philipp [1 ]
Magno, Michele [1 ]
Affiliations
[1] Swiss Federal Institute of Technology (ETH Zurich), Department of Information Technology and Electrical Engineering, Zurich, Switzerland
Keywords
Adaptive sampling; energy harvesting; energy management; wireless smart sensors; reinforcement learning
DOI
10.1109/ICECS202256217.2022.9971120
CLC Classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
Energy harvesting can enable wireless smart sensors to be self-sustainable by allowing them to gather energy from the environment. However, since energy availability changes dynamically with the environment, it is difficult to find an optimal energy management strategy at design time. One existing approach to reflecting dynamic energy availability is energy-aware adaptive sampling, which changes the sampling rate of a sensor according to its energy state. This work proposes deep reinforcement learning-based predictive adaptive sampling for a wireless sensor node. The proposed approach applies deep reinforcement learning to find an effective adaptive sampling strategy based on the harvesting power and the energy level. In addition, it enables predictive adaptive sampling through adaptive sampling models that consider the trend of the energy state. The evaluation results show that the predictive models can successfully manage the energy budget while reflecting dynamic energy availability, maintaining a stable energy state for up to 11.5% longer.
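The abstract names the quantities that drive the policy: the harvesting power, the stored energy level, and the trend of the energy state, with the sensor's sampling rate as the controlled quantity. The sketch below is only a rough illustration of how such a formulation could be set up, not the authors' implementation; the network size, the discrete sampling rates, the reward handling, and all helper names are assumptions.

# Illustrative sketch (assumptions throughout): a small Q-network mapping an
# energy state (harvesting power, energy level, energy trend) to a choice of
# sampling rate, trained with one-step temporal-difference updates.
import random
import torch
import torch.nn as nn

SAMPLING_RATES_HZ = [0.1, 0.5, 1.0, 2.0, 5.0]  # hypothetical discrete actions

class QNet(nn.Module):
    def __init__(self, n_state=3, n_action=len(SAMPLING_RATES_HZ)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_state, 32), nn.ReLU(),
            nn.Linear(32, n_action),
        )

    def forward(self, x):
        return self.net(x)

def make_state(harvest_power, energy_level, energy_trend):
    # State features mirror the quantities named in the abstract.
    return torch.tensor([harvest_power, energy_level, energy_trend],
                        dtype=torch.float32)

qnet = QNet()
optimizer = torch.optim.Adam(qnet.parameters(), lr=1e-3)
gamma, eps = 0.95, 0.1

def select_action(state):
    # Epsilon-greedy selection over the candidate sampling rates.
    if random.random() < eps:
        return random.randrange(len(SAMPLING_RATES_HZ))
    with torch.no_grad():
        return int(qnet(state).argmax())

def td_update(state, action, reward, next_state):
    # One-step TD update of the Q-network toward reward + gamma * max Q(s').
    q = qnet(state)[action]
    with torch.no_grad():
        target = reward + gamma * qnet(next_state).max()
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Example: pick a sampling rate for a mid-charged node whose energy is rising.
state = make_state(harvest_power=2.5, energy_level=0.6, energy_trend=0.1)
rate_hz = SAMPLING_RATES_HZ[select_action(state)]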
Pages: 4
Related Papers
50 records in total
  • [21] Availability-aware and energy-aware dynamic SFC placement using reinforcement learning
    Guto Leoni Santos
    Theo Lynn
    Judith Kelner
    Patricia Takako Endo
    The Journal of Supercomputing, 2021, 77: 12711-12740
  • [22] Energy-Aware Task Allocation for Mobile IoT by Online Reinforcement Learning
    Yao, Jingjing
    Ansari, Nirwan
    ICC 2019 - 2019 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2019
  • [23] Graph Convolutional Reinforcement Learning for Advanced Energy-Aware Process Planning
    Xiao, Qinge
    Niu, Ben
    Xue, Bing
    Hu, Luoke
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2023, 53 (05): 2802-2814
  • [24] Energy-Aware Federated Learning With Distributed User Sampling and Multichannel ALOHA
    da Silva, Rafael Valente
    Lopez, Onel L. Alcaraz
    Souza, Richard Demo
    IEEE COMMUNICATIONS LETTERS, 2023, 27 (10): 2867-2871
  • [25] Energy-Aware Hierarchical Reinforcement Learning Based on the Predictive Energy Consumption Algorithm for Search and Rescue Aerial Robots in Unknown Environments
    Ramezani, M.
    Atashgah, M. A. Amiri
    DRONES, 2024, 8 (07)
  • [26] Deep Reinforcement Learning Empowers Wireless Powered Mobile Edge Computing: Towards Energy-Aware Online Offloading
    Jiao, Xianlong
    Wang, Yating
    Guo, Songtao
    Zhang, Hong
    Dai, Haipeng
    Li, Mingyan
    Zhou, Pengzhan
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2023, 71 (09): 5214-5227
  • [27] Toward Intelligent Connected E-Mobility: Energy-Aware Cooperative Driving With Deep Multiagent Reinforcement Learning
    He, Xiangkun
    Lv, Chen
    IEEE VEHICULAR TECHNOLOGY MAGAZINE, 2023, 18 (03): 101-109
  • [28] Co-Evolution With Deep Reinforcement Learning for Energy-Aware Distributed Heterogeneous Flexible Job Shop Scheduling
    Li, Rui
    Gong, Wenyin
    Wang, Ling
    Lu, Chao
    Dong, Chenxin
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2024, 54 (01): 201-211
  • [29] Energy-aware task scheduling and offloading using deep reinforcement learning in SDN-enabled IoT network
    Sellami, Bassem
    Hakiri, Akram
    Ben Yahia, Sadok
    Berthou, Pascal
    COMPUTER NETWORKS, 2022, 210