Population-Specific Glucose Prediction in Diabetes Care With Transformer-Based Deep Learning on the Edge

Cited by: 13
Authors
Zhu, Taiyu [1 ,2 ]
Kuang, Lei [1 ]
Piao, Chengzhe [3 ]
Zeng, Junming [1 ]
Li, Kezhi [3 ]
Georgiou, Pantelis [1 ]
Affiliations
[1] Imperial Coll London, Ctr Bioinspired Technol, Dept Elect & Elect Engn, London SW7 2BX, England
[2] Univ Oxford, Dept Psychiat, Oxford OX1 2JD, England
[3] UCL, Inst Hlth Informat, London SW7 2BX, England
Keywords
Diabetes; Thin film transistors; Glucose; Transformers; Predictive models; Computational modeling; Deep learning; Artificial intelligence; deep learning; diabetes; edge computing; glucose prediction; low power wearable device; transformer; DAILY INSULIN INJECTIONS; HYPOGLYCEMIA; ADULTS
DOI
10.1109/TBCAS.2023.3348844
Chinese Library Classification: R318 [Biomedical Engineering]
Subject Classification: 0831
Abstract
Leveraging continuous glucose monitoring (CGM) systems, real-time blood glucose (BG) forecasting is essential for proactive interventions, playing a crucial role in enhancing the management of type 1 diabetes (T1D) and type 2 diabetes (T2D). However, developing a model that generalizes across a population and subsequently embedding it within the microchip of a wearable device presents significant technical challenges. Furthermore, BG prediction in T2D remains under-explored in the literature. In light of this, we propose a population-specific BG prediction model, leveraging the capabilities of the temporal fusion Transformer (TFT) to adjust predictions based on personal demographic data. The trained model is then embedded within a system-on-chip at the core of our low-power, low-cost customized wearable device. This device communicates seamlessly with CGM systems over Bluetooth and provides timely BG predictions using edge computing. When evaluated on two publicly available clinical datasets with a total of 124 participants with T1D or T2D, the embedded TFT model consistently demonstrated superior performance, achieving the lowest prediction errors compared with a range of machine learning baseline methods. Executing the TFT model on our wearable device requires minimal memory and power consumption, enabling continuous decision support for more than 51 days on a single Li-Poly battery charge. These findings demonstrate the significant potential of the proposed TFT model and wearable device in enhancing the quality of life for people with diabetes and effectively addressing real-world challenges.
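The abstract describes a TFT-style forecaster that attends over a recent CGM window and conditions its output on static demographic covariates. The actual model, training procedure, and coefficients are not given in this record; the following is only a minimal pure-Python sketch of that general idea, assuming 5-minute CGM sampling, a single attention head, and entirely hypothetical demographic coefficients (`insulin_user` damping, age-based damping).

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def predict_glucose(history, age, insulin_user, steps_ahead=6):
    """Toy single-head attention forecaster (NOT the paper's TFT).

    history      : recent CGM readings in mg/dL, oldest first, 5-min sampling
    age          : participant age in years (static covariate)
    insulin_user : whether the participant uses insulin (static covariate)
    steps_ahead  : forecast horizon in 5-min steps (6 -> 30 min)
    """
    # Query is the most recent reading; keys/values are the full window.
    q = history[-1]
    scale = math.sqrt(max(1.0, q))
    scores = [k * q / (scale * 100.0) for k in history]
    weights = softmax(scores)
    # Attention-weighted context over the CGM window.
    context = sum(w * v for w, v in zip(weights, history))
    # Short-term linear trend (mg/dL per step) from the last two readings.
    trend = history[-1] - history[-2]
    # Static conditioning: hypothetical damping factors standing in for the
    # TFT's learned static-covariate encoders.
    damping = 0.9 if insulin_user else 1.0
    damping *= 1.0 - 0.001 * max(0, age - 40)
    return context + damping * trend * steps_ahead
```

In the real TFT, the static covariates drive learned variable-selection and gating networks rather than a fixed damping factor, and the model emits quantile forecasts over the whole horizon rather than a single point estimate.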
Pages: 236-246
Page count: 11