Population-Specific Glucose Prediction in Diabetes Care With Transformer-Based Deep Learning on the Edge

Cited by: 13
Authors
Zhu, Taiyu [1 ,2 ]
Kuang, Lei [1 ]
Piao, Chengzhe [3 ]
Zeng, Junming [1 ]
Li, Kezhi [3 ]
Georgiou, Pantelis [1 ]
Affiliations
[1] Imperial Coll London, Ctr Bioinspired Technol, Dept Elect & Elect Engn, London SW7 2BX, England
[2] Univ Oxford, Dept Psychiat, Oxford OX1 2JD, England
[3] UCL, Inst Hlth Informat, London SW7 2BX, England
Keywords
Diabetes; Thin film transistors; Glucose; Transformers; Predictive models; Computational modeling; Deep learning; Artificial intelligence; deep learning; diabetes; edge computing; glucose prediction; low power wearable device; transformer; DAILY INSULIN INJECTIONS; HYPOGLYCEMIA; ADULTS;
DOI
10.1109/TBCAS.2023.3348844
Chinese Library Classification
R318 [Biomedical Engineering];
Discipline Classification Code
0831;
Abstract
Leveraging continuous glucose monitoring (CGM) systems, real-time blood glucose (BG) forecasting is essential for proactive interventions and plays a crucial role in enhancing the management of type 1 diabetes (T1D) and type 2 diabetes (T2D). However, developing a model that generalizes across a population and subsequently embedding it in the microchip of a wearable device presents significant technical challenges. Furthermore, BG prediction in T2D remains under-explored in the literature. In light of this, we propose a population-specific BG prediction model built on the temporal fusion Transformer (TFT), which adjusts its predictions based on personal demographic data. The trained model is then embedded in a system-on-chip at the core of our low-power, low-cost customized wearable device. The device communicates with CGM systems over Bluetooth and delivers timely BG predictions using edge computing. When evaluated on two publicly available clinical datasets with a total of 124 participants with T1D or T2D, the embedded TFT model consistently achieved the lowest prediction errors compared with a range of machine learning baseline methods. Executing the TFT model on our wearable device requires minimal memory and power, enabling continuous decision support for more than 51 days on a single Li-Poly battery charge. These findings demonstrate the significant potential of the proposed TFT model and wearable device to enhance the quality of life of people with diabetes and to address real-world challenges.
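The abstract describes conditioning a temporal fusion Transformer (TFT) forecaster on static demographic inputs before deploying it on an edge device. The sketch below is a minimal, hypothetical illustration of that conditioning idea using a generic Transformer encoder in PyTorch; it is not the authors' implementation, and the layer sizes, feature names, fusion scheme, and prediction horizon are assumptions chosen only for illustration (a full TFT additionally uses variable-selection and gating networks).

```python
# Minimal sketch (not the authors' implementation) of a Transformer-style
# glucose forecaster conditioned on static demographic features, in the
# spirit of the TFT-based model described in the abstract.
import torch
import torch.nn as nn


class DemographicConditionedForecaster(nn.Module):
    """Forecast future glucose from a CGM history plus static demographics."""

    def __init__(self, d_model=64, n_heads=4, n_layers=2,
                 n_static=3, horizon=6):
        super().__init__()
        # Embed each CGM sample (one value per 5-min step) into d_model.
        self.cgm_embed = nn.Linear(1, d_model)
        # Embed static demographic features (e.g. age, BMI, diabetes type);
        # the choice of three features is an assumption for illustration.
        self.static_embed = nn.Linear(n_static, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Map the latest hidden state to a multi-step forecast
        # (e.g. 30 min ahead at 5-min resolution -> horizon = 6).
        self.head = nn.Linear(d_model, horizon)

    def forward(self, cgm_history, static_features):
        # cgm_history: (batch, seq_len, 1); static_features: (batch, n_static)
        x = self.cgm_embed(cgm_history)
        # Add the static context to every time step (a simple fusion choice;
        # the TFT instead uses gated variable-selection networks).
        x = x + self.static_embed(static_features).unsqueeze(1)
        h = self.encoder(x)
        # Use the representation of the most recent time step for forecasting.
        return self.head(h[:, -1, :])


if __name__ == "__main__":
    model = DemographicConditionedForecaster()
    cgm = torch.randn(8, 24, 1)   # 8 subjects, 2 h of 5-min CGM readings
    demo = torch.randn(8, 3)      # 3 normalized demographic features
    pred = model(cgm, demo)       # shape (8, 6): next 30 min of glucose
    print(pred.shape)
```

For an edge deployment of the kind described, such a model would typically be quantized and exported to a lightweight inference runtime before being executed on the wearable's system-on-chip; the specifics of that toolchain are not given in the abstract.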
Pages: 236-246
Number of pages: 11