Online MECG Compression Based on Incremental Tensor Decomposition for Wearable Devices

Cited by: 3
Authors
Xiao, Ling [1 ]
Zhang, Qian [2 ]
Xie, Kun [1 ]
Xiao, Chunxia [3 ]
Affiliations
[1] Hunan Univ, Coll Comp Sci & Elect Engn, Changsha 410082, Peoples R China
[2] Key Lab Embedded & Network Comp Hunan Prov, Changsha, Peoples R China
[3] Hunan Prov Peoples Hosp, Changsha, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-lead electrocardiogram (MECG); signal compression; tensor decomposition; wearable devices;
DOI
10.1109/JBHI.2020.3017790
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Lightweight, real-time multi-lead electrocardiogram (MECG) compression on wearable devices is important and challenging for long-term health monitoring. To exploit all three kinds of correlation in MECG data simultaneously, we construct a third-order incremental tensor and formulate the data compression problem as tensor decomposition. However, conventional tensor decomposition algorithms for large-scale tensors are usually too computationally expensive to run on wearable devices. To reduce the computational complexity, we develop an online compression approach that incrementally tracks the CANDECOMP/PARAFAC (CP) decomposition of the dynamically growing tensor, efficiently reusing the compression result obtained on previous MECG data to derive the compressed representation when new data arrive. We evaluate the performance of our method on the Physikalisch-Technische Bundesanstalt (PTB) MECG diagnostic dataset. Our method achieves an average percentage root-mean-square difference (PRD) of 8.35% ± 2.28% and a compression ratio (CR) of 43.05 ± 2.01, outperforming five state-of-the-art methods. It also preserves R-peak information well. Our method is suitable for near real-time MECG compression on wearable devices.
Pages: 1041 - 1051
Number of pages: 11
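The abstract above describes compressing an MECG recording by arranging it as a third-order tensor and storing only its CP factor matrices. The sketch below illustrates that basic idea only; it is not the authors' incremental tracking algorithm, and it assumes the open-source tensorly library, an illustrative tensor layout (lead × sample × beat), and an arbitrary CP rank.

```python
# Minimal sketch of CP-decomposition-based compression of a third-order MECG tensor.
# Assumptions: tensorly is available; the (12 leads x 200 samples x 50 beats) layout,
# the synthetic data, and rank = 8 are illustrative choices, not the paper's settings.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Synthetic stand-in for a segmented MECG recording.
mecg_tensor = tl.tensor(np.random.randn(12, 200, 50))

rank = 8  # CP rank: trades compression ratio (CR) against distortion (PRD)

# CP (CANDECOMP/PARAFAC) decomposition; the factor matrices are the compressed form.
weights, factors = parafac(mecg_tensor, rank=rank, n_iter_max=200, init='svd')

# Reconstruct and measure distortion via the percentage root-mean-square difference (PRD).
recon = tl.cp_to_tensor((weights, factors))
prd = 100.0 * np.linalg.norm(mecg_tensor - recon) / np.linalg.norm(mecg_tensor)

# Compression ratio: original entries vs. entries kept in the factor matrices.
original_size = np.prod(mecg_tensor.shape)
compressed_size = rank * sum(mecg_tensor.shape)
cr = original_size / compressed_size
print(f"PRD = {prd:.2f}%, CR = {cr:.2f}")
```

In the paper's online setting, the factor matrices would not be recomputed from scratch as above; they would be updated incrementally as new beats extend the tensor along one mode, which is what keeps the method lightweight enough for wearable devices.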