Predictive Energy-Aware Adaptive Sampling with Deep Reinforcement Learning

Cited by: 2
Authors
Heo, Seonyeong [1 ]
Mayer, Philipp [1 ]
Magno, Michele [1 ]
Affiliations
[1] Swiss Federal Institute of Technology (ETH Zurich), Department of Information Technology and Electrical Engineering, Zurich, Switzerland
Keywords
Adaptive sampling; energy harvesting; energy management; wireless smart sensors; reinforcement learning
DOI
10.1109/ICECS202256217.2022.9971120
CLC Classification Number
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Subject Classification Code
0808; 0809
Abstract
Energy harvesting can make wireless smart sensors self-sustainable by allowing them to gather energy from the environment. However, since energy availability changes dynamically with the environment, it is difficult to find an optimal energy management strategy at design time. One existing approach that reflects dynamic energy availability is energy-aware adaptive sampling, which changes the sampling rate of a sensor according to its energy state. This work proposes deep reinforcement learning-based predictive adaptive sampling for a wireless sensor node. The proposed approach applies deep reinforcement learning to find an effective adaptive sampling strategy based on the harvesting power and energy level. In addition, it enables predictive adaptive sampling through adaptive sampling models that consider the trend of the energy state. The evaluation results show that the predictive models can successfully manage the energy budget to reflect dynamic energy availability, maintaining a stable energy state for up to 11.5% longer.
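As a rough illustration of the approach the abstract describes, the sketch below casts adaptive sampling as a reinforcement learning problem: the state combines a discretized energy level, harvesting-power level, and energy trend (the "predictive" input), the actions are candidate sampling rates, and the reward favors sampling fast while penalizing near-depletion. For brevity it uses tabular Q-learning in place of the authors' deep network; the toy energy dynamics, discretization, and reward shaping are invented assumptions, not the paper's actual formulation.

```python
# Illustrative sketch only: tabular Q-learning stands in for the paper's
# deep RL agent, and the energy dynamics below are toy assumptions.
import random

SAMPLING_RATES_HZ = [1, 5, 10, 20]   # candidate sampling rates (the action set)
ENERGY_LEVELS = 10                   # discretized energy-storage levels
HARVEST_LEVELS = 4                   # discretized harvesting-power levels

Q = {}  # Q-table: (state, action) -> value; state = (energy, harvest, trend)

def q(state, a):
    return Q.get((state, a), 0.0)

def choose_action(state, eps=0.1):
    """Epsilon-greedy selection over the candidate sampling rates."""
    if random.random() < eps:
        return random.randrange(len(SAMPLING_RATES_HZ))
    return max(range(len(SAMPLING_RATES_HZ)), key=lambda a: q(state, a))

def step(energy, harvest, action):
    """Toy dynamics: consumption grows with the sampling rate, income with
    the harvesting power. Reward favors fast sampling, penalizes depletion."""
    consumption = SAMPLING_RATES_HZ[action] / 10.0
    income = 0.5 * harvest
    new_energy = min(ENERGY_LEVELS - 1, max(0, round(energy + income - consumption)))
    reward = SAMPLING_RATES_HZ[action] / 20.0
    if new_energy <= 1:
        reward -= 5.0  # near-depletion penalty encourages a stable energy state
    return new_energy, reward

def trend_of(prev_energy, energy):
    """Energy trend: 0 = falling, 1 = flat, 2 = rising."""
    return 0 if energy < prev_energy else (2 if energy > prev_energy else 1)

def train(episodes=2000, steps=200, alpha=0.1, gamma=0.95):
    n = len(SAMPLING_RATES_HZ)
    for _ in range(episodes):
        energy = prev = ENERGY_LEVELS // 2
        harvest = random.randrange(HARVEST_LEVELS)
        for _ in range(steps):
            state = (energy, harvest, trend_of(prev, energy))
            a = choose_action(state)
            prev = energy
            energy, r = step(energy, harvest, a)
            harvest = random.randrange(HARVEST_LEVELS)  # next harvesting reading
            nstate = (energy, harvest, trend_of(prev, energy))
            target = r + gamma * max(q(nstate, b) for b in range(n))
            Q[(state, a)] = q(state, a) + alpha * (target - q(state, a))

if __name__ == "__main__":
    train()
    # Query the learned policy: mid energy, no harvesting, falling trend.
    state = (ENERGY_LEVELS // 2, 0, 0)
    print("chosen rate:", SAMPLING_RATES_HZ[choose_action(state, eps=0.0)], "Hz")
```

On a real node, the learned policy would presumably be trained offline on recorded harvesting traces and deployed as a small lookup table or network queried at each scheduling interval; the paper's deep RL formulation may differ in state encoding, reward, and training procedure.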
Pages: 4
Related Papers
50 records in total
  • [31] ENERGY-AWARE ADAPTIVE OFDM SYSTEMS
    Emre, Y.
    Chakrabarti, C.
    2010 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2010, : 1590 - 1593
  • [32] A Reinforcement Learning Approach for Cost- and Energy-Aware Mobile Data Offloading
    Zhang, Cheng
    Gu, Bo
    Liu, Zhi
    Yamori, Kyoko
    Tanaka, Yoshiaki
    2016 18TH ASIA-PACIFIC NETWORK OPERATIONS AND MANAGEMENT SYMPOSIUM (APNOMS), 2016
  • [33] Near-optimal reinforcement learning framework for energy-aware sensor communications
    Pandana, C
    Liu, KJR
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2005, 23 (04) : 788 - 797
  • [34] Data-driven Energy-efficient Adaptive Sampling Using Deep Reinforcement Learning
    Demirel, B. U.
    Chen, L.
    Al Faruque, M. A.
    ACM TRANSACTIONS ON COMPUTING FOR HEALTHCARE, 2023, 4 (03)
  • [35] Energy-Aware Deep Learning for Green Cyber-Physical Systems
    Puangpontip, Supadchaya
    Hewett, Rattikorn
    PROCEEDINGS OF THE 11TH INTERNATIONAL CONFERENCE ON SMART CITIES AND GREEN ICT SYSTEMS (SMARTGREENS), 2022, : 32 - 43
  • [36] Energy-Aware MPTCP Scheduling in Heterogeneous Wireless Networks Using Multi-Agent Deep Reinforcement Learning Techniques
    Arain, Zulfiqar Ali
    Qiu, Xuesong
    Xu, Changqiao
    Wang, Mu
    Abdul Rahim, Mussadiq
    ELECTRONICS, 2023, 12 (21)
  • [37] Energy-aware systems for real-time job scheduling in cloud data centers: A deep reinforcement learning approach
    Yan, Jingchen
    Huang, Yifeng
    Gupta, Aditya
    Gupta, Anubhav
    Liu, Cong
    Li, Jianbin
    Cheng, Long
    COMPUTERS & ELECTRICAL ENGINEERING, 2022, 99
  • [38] A Deep Reinforcement Learning Based Approach for Cost- and Energy-Aware Multi-Flow Mobile Data Offloading
    Zhang, Cheng
    Liu, Zhi
    Gu, Bo
    Yamori, Kyoko
    Tanaka, Yoshiaki
    IEICE TRANSACTIONS ON COMMUNICATIONS, 2018, E101B (07) : 1625 - 1634
  • [39] Attention-Aware Sampling via Deep Reinforcement Learning for Action Recognition
    Dong, Wenkai
    Zhang, Zhaoxiang
    Tan, Tieniu
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 8247 - 8254
  • [40] Energy-Aware Adaptive Sectorisation in LTE Systems
    Qi, Yinan
    Imran, Muhammad Ali
    Tafazolli, Rahim
    2011 IEEE 22ND INTERNATIONAL SYMPOSIUM ON PERSONAL INDOOR AND MOBILE RADIO COMMUNICATIONS (PIMRC), 2011, : 2402 - 2406