TransEHR: Self-Supervised Transformer for Clinical Time Series Data

Cited by: 0
Authors: Xu, Yanbo [1]; Xu, Shangqing [1]; Ramprassad, Manav [1]; Tumanov, Alexey [1]; Zhang, Chao [1]
Affiliation: [1] Georgia Inst Technol, Atlanta, GA 30332 USA
Funding: US National Science Foundation
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Deep neural networks, including the Transformer architecture, have achieved remarkable performance on a variety of time series tasks. However, their effectiveness on clinical time series data is hindered by two challenges: 1) sparse event sequences collected asynchronously with multivariate time series, and 2) limited availability of labeled data. To address these challenges, we propose TransEHR, a self-supervised Transformer model designed to efficiently encode multi-sourced asynchronous sequential data such as structured Electronic Health Records (EHRs). We introduce three pretext tasks for pre-training the Transformer model on large amounts of unlabeled structured EHR data, followed by fine-tuning on downstream prediction tasks using the limited labeled data. Through extensive experiments on three real-world health datasets, we demonstrate that our model achieves state-of-the-art performance on benchmark clinical tasks, including in-hospital mortality classification, phenotyping, and length-of-stay prediction. Our findings highlight the efficacy of TransEHR in addressing the challenges of clinical time series data, thus contributing to advances in healthcare analytics.
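The workflow the abstract describes, self-supervised pre-training on unlabeled EHR time series followed by fine-tuning on a small labeled set, can be illustrated with a minimal sketch. The masked-value reconstruction pretext task, the mortality head, and all model dimensions below are illustrative assumptions for a generic pre-train/fine-tune pipeline, not the paper's actual design or its three pretext tasks.

```python
# Hypothetical sketch of a pre-train-then-fine-tune workflow for clinical time
# series. The pretext task (masked-value reconstruction), model sizes, and
# variable names are assumptions, not the TransEHR paper's actual design.
import torch
import torch.nn as nn


class TimeSeriesTransformer(nn.Module):
    """Transformer encoder over multivariate clinical time series."""

    def __init__(self, n_features: int, d_model: int = 64, n_layers: int = 2):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Pretext head: reconstruct the masked feature values.
        self.recon_head = nn.Linear(d_model, n_features)
        # Downstream head: e.g. in-hospital mortality (one logit per stay).
        self.cls_head = nn.Linear(d_model, 1)

    def encode(self, x: torch.Tensor) -> torch.Tensor:
        return self.encoder(self.input_proj(x))  # (batch, time, d_model)

    def reconstruct(self, x: torch.Tensor) -> torch.Tensor:
        return self.recon_head(self.encode(x))

    def classify(self, x: torch.Tensor) -> torch.Tensor:
        # Mean-pool over time, then predict a single logit per stay.
        return self.cls_head(self.encode(x).mean(dim=1)).squeeze(-1)


def pretrain_step(model, x, mask_ratio=0.15):
    """One self-supervised step: mask random values and reconstruct them."""
    mask = torch.rand_like(x) < mask_ratio
    x_masked = x.masked_fill(mask, 0.0)
    recon = model.reconstruct(x_masked)
    return ((recon - x) ** 2)[mask].mean()  # MSE only on masked positions


def finetune_step(model, x, y):
    """One supervised step on the limited labeled data."""
    return nn.functional.binary_cross_entropy_with_logits(model.classify(x), y)


if __name__ == "__main__":
    model = TimeSeriesTransformer(n_features=17)  # e.g. 17 vital/lab channels
    x = torch.randn(8, 48, 17)                    # 8 stays, 48 hourly steps
    y = torch.randint(0, 2, (8,)).float()         # mortality labels
    print("pretext loss:", pretrain_step(model, x).item())
    print("fine-tune loss:", finetune_step(model, x, y).item())
```

In practice the same encoder weights are carried from the pretext phase into fine-tuning, so the labeled data only needs to adapt the encoder and train the small task head.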
Pages: 623-635 (13 pages)
Related Papers (items [41]-[50] of 50 shown)
  • [41] MST: Masked Self-Supervised Transformer for Visual Representation. Li, Zhaowen; Chen, Zhiyang; Yang, Fan; Li, Wei; Zhu, Yousong; Zhao, Chaoyang; Deng, Rui; Wu, Liwei; Zhao, Rui; Tang, Ming; Wang, Jinqiao. Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34.
  • [42] Self-Supervised Image Aesthetic Assessment Based on Transformer. Jia, Minrui; Wang, Guangao; Wang, Zibei; Yang, Shuai; Ke, Yongzhen; Wang, Kai. International Journal of Computational Intelligence and Applications, 2025, 24 (01).
  • [43] Self-Supervised Contrastive Representation Learning for Semi-Supervised Time-Series Classification. Eldele, Emadeldeen; Ragab, Mohamed; Chen, Zhenghua; Wu, Min; Kwoh, Chee-Keong; Li, Xiaoli; Guan, Cuntai. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45 (12): 15604-15618.
  • [44] Transformer encoder based self-supervised learning for HVAC fault detection with unlabeled data. Abdollah, M. A. F.; Scoccia, R.; Aprile, M. Building and Environment, 2024, 258.
  • [45] MAD: Self-Supervised Masked Anomaly Detection Task for Multivariate Time Series. Fu, Yiwei; Xue, Feng. 2022 International Joint Conference on Neural Networks (IJCNN), 2022.
  • [46] A self-supervised contrastive change point detection method for industrial time series. Bao, Xiangyu; Chen, Liang; Zhong, Jingshu; Wu, Dianliang; Zheng, Yu. Engineering Applications of Artificial Intelligence, 2024, 133.
  • [47] Self-supervised multi-transformation learning for time series anomaly detection. Han, Han; Fan, Haoyi; Huang, Xunhua; Han, Chuang. Expert Systems with Applications, 2024, 253.
  • [48] Efficient time series anomaly detection by multiresolution self-supervised discriminative network. Huang, Desen; Shen, Lifeng; Yu, Zhongzhong; Zheng, Zhenjing; Huang, Min; Ma, Qianli. Neurocomputing, 2022, 491: 261-272.
  • [50] TimeCLR: A self-supervised contrastive learning framework for univariate time series representation. Yang, Xinyu; Zhang, Zhenguo; Cui, Rongyi. Knowledge-Based Systems, 2022, 245.