Predictive Energy-Aware Adaptive Sampling with Deep Reinforcement Learning

Cited by: 2
Authors
Heo, Seonyeong [1]
Mayer, Philipp [1]
Magno, Michele [1]
Affiliations
[1] Swiss Fed Inst Technol, Dept Informat Technol & Elect Engn, Zurich, Switzerland
Keywords
Adaptive sampling; energy harvesting; energy management; wireless smart sensors; reinforcement learning
DOI
10.1109/ICECS202256217.2022.9971120
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Code
0808; 0809
Abstract
Energy harvesting can enable wireless smart sensors to be self-sustaining by allowing them to gather energy from the environment. However, since energy availability changes dynamically with the environment, it is difficult to find an optimal energy management strategy at design time. One existing approach to handling dynamic energy availability is energy-aware adaptive sampling, which changes the sampling rate of a sensor according to its energy state. This work proposes deep reinforcement learning-based predictive adaptive sampling for a wireless sensor node. The proposed approach applies deep reinforcement learning to find an effective adaptive sampling strategy based on the harvesting power and energy level. In addition, it enables predictive adaptive sampling through adaptive sampling models that consider the trend of the energy state. The evaluation results show that the predictive models can successfully manage the energy budget in response to dynamic energy availability, maintaining a stable energy state for up to 11.5% longer.
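The record contains no code, but the idea in the abstract can be sketched concretely. Below is a minimal, illustrative deep Q-learning loop, not the authors' implementation: the agent observes an assumed three-dimensional energy state (harvesting power, stored energy level, recent energy trend) and picks one of a few discrete sampling rates. The action set, toy energy dynamics, reward shaping, and hyperparameters are all assumptions introduced only for illustration.

import random
import torch
import torch.nn as nn

SAMPLING_RATES_HZ = [0.1, 0.5, 1.0, 2.0]  # assumed discrete action set

class SamplingPolicy(nn.Module):
    """Q-network mapping an energy state to a Q-value per candidate sampling rate."""
    def __init__(self, state_dim=3, n_actions=len(SAMPLING_RATES_HZ)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 32), nn.ReLU(),
            nn.Linear(32, 32), nn.ReLU(),
            nn.Linear(32, n_actions),
        )

    def forward(self, state):
        return self.net(state)

def reward(energy_level, rate_hz):
    # Assumed reward: favour frequent sampling, penalise a nearly depleted buffer.
    return 0.1 * rate_hz - (1.0 if energy_level < 0.2 else 0.0)

def step(state, action_idx):
    # Toy energy dynamics, for illustration only: harvested energy in, sampling cost out.
    harvest, level, _trend = state
    consumption = 0.02 * SAMPLING_RATES_HZ[action_idx]
    new_harvest = min(max(harvest + random.uniform(-0.05, 0.05), 0.0), 1.0)
    new_level = min(max(level + harvest - consumption, 0.0), 1.0)
    return (new_harvest, new_level, new_level - level)  # trend = change in energy level

policy = SamplingPolicy()
optimiser = torch.optim.Adam(policy.parameters(), lr=1e-3)
gamma, epsilon = 0.95, 0.1
state = (0.3, 0.5, 0.0)  # (harvesting power, energy level, trend), normalised to [0, 1]

for _ in range(2000):  # online Q-learning loop (no replay buffer, kept short for brevity)
    q_values = policy(torch.tensor(state, dtype=torch.float32))
    if random.random() < epsilon:
        action = random.randrange(len(SAMPLING_RATES_HZ))
    else:
        action = int(q_values.argmax())
    next_state = step(state, action)
    r = reward(next_state[1], SAMPLING_RATES_HZ[action])
    with torch.no_grad():
        target = r + gamma * policy(torch.tensor(next_state, dtype=torch.float32)).max()
    loss = (q_values[action] - target) ** 2
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    state = next_state

Because the state includes a trend term, a policy learned this way can lower the sampling rate before the energy buffer actually depletes, which is the sense in which the paper describes its adaptive sampling as predictive.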
Pages: 4
Related Papers
50 records in total
  • [1] GreenABR: Energy-Aware Adaptive Bitrate Streaming with Deep Reinforcement Learning
    Turkkan, Bekir Oguzhan
    Dai, Ting
    Raman, Adithya
    Kosar, Tevfik
    Chen, Changyou
    Bulut, Muhammed Fatih
    Zola, Jaroslaw
    Sow, Daby
    PROCEEDINGS OF THE 13TH ACM MULTIMEDIA SYSTEMS CONFERENCE, MMSYS 2022, 2022, : 150 - 163
  • [2] Energy-aware Multiple Access Using Deep Reinforcement Learning
    Mazandarani, Hamid Reza
    Khorsandi, Siavash
    2021 29TH IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE), 2021, : 521 - 525
  • [3] Reinforcement Learning and Energy-Aware Routing
    Frohlich, Piotr
    Gelenbe, Erol
    Nowak, Mateusz
    PROCEEDINGS OF THE 4TH FLEXNETS WORKSHOP ON FLEXIBLE NETWORKS, ARTIFICIAL INTELLIGENCE SUPPORTED NETWORK FLEXIBILITY AND AGILITY (FLEXNETS'21), 2021, : 26 - 31
  • [4] Energy-aware Adaptive Approximate Computing for Deep Learning Applications
    TaheriNejad, Nima
    Shakibhamedan, Salar
    2022 IEEE COMPUTER SOCIETY ANNUAL SYMPOSIUM ON VLSI (ISVLSI 2022), 2022, : 328 - 328
  • [5] SFC Consolidation: Energy-aware SFC Management using Deep Reinforcement Learning
    Jeong, Eui-Dong
    Yoo, Jae-Hyoung
    Hong, James Won-Ki
    PROCEEDINGS OF 2024 IEEE/IFIP NETWORK OPERATIONS AND MANAGEMENT SYMPOSIUM, NOMS 2024, 2024,
  • [6] Energy-Aware Deep Reinforcement Learning Scheduling for Sensors Correlated in Time and Space
    Hribar, Jernej
    Marinescu, Andrei
    Chiumento, Alessandro
    Dasilva, Luiz A.
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (09) : 6732 - 6744
  • [7] Energy-aware scheduling for spark job based on deep reinforcement learning in cloud
    Li, Hongjian
    Lu, Liang
    Shi, Wenhu
    Tan, Gangfan
    Luo, Hao
    COMPUTING, 2023, 105 (08) : 1717 - 1743
  • [8] Energy-aware scheduling for spark job based on deep reinforcement learning in cloud
    Hongjian Li
    Liang Lu
    Wenhu Shi
    Gangfan Tan
    Hao Luo
    Computing, 2023, 105 : 1717 - 1743
  • [9] Energy-Aware Design Policy for Network Slicing Using Deep Reinforcement Learning
    Wang, Ranyin
    Friderikos, Vasilis
    Aghvami, A. Hamid
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (05) : 2378 - 2391
  • [10] Energy-aware Scheduling of Jobs in Heterogeneous Cluster Systems Using Deep Reinforcement Learning
    Esmaili, Amirhossein
    Pedram, Massoud
    PROCEEDINGS OF THE TWENTYFIRST INTERNATIONAL SYMPOSIUM ON QUALITY ELECTRONIC DESIGN (ISQED 2020), 2020, : 426 - 431