Denoising Knowledge Transfer Model for Zero-Shot MRI Reconstruction

Cited: 0
Authors
Hou, Ruizhi [1 ]
Li, Fang [2 ,3 ]
Affiliations
[1] Xian Univ Sci & Technol, Coll Comp Sci & Technol, Shaanxi 710054, Peoples R China
[2] East China Normal Univ, Sch Math Sci, Key Lab MEA, Minist Educ, Shanghai 200241, Peoples R China
[3] East China Normal Univ, Shanghai Key Lab PMMP, Shanghai 200241, Peoples R China
Funding
Natural Science Foundation of Shanghai;
Keywords
Image reconstruction; Training; Noise reduction; Zero shot learning; Magnetic resonance imaging; Hands; Ensemble learning; Computational modeling; Uncertainty; Training data; Diffusion model; ensemble learning; MRI reconstruction; plug-and-play; zero-shot learning; IMAGE; REGULARIZATION; PRIORS; SENSE; PLUG;
DOI
10.1109/TCI.2025.3525960
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline classification code
0808 ; 0809 ;
Abstract
Although fully-supervised deep learning methods have achieved remarkable results in accelerated magnetic resonance imaging (MRI) reconstruction, fully-sampled or high-quality data are unavailable in many scenarios. Zero-shot learning enables training on under-sampled data alone; however, the limited information in under-sampled data prevents the neural network from reaching its full potential. This paper proposes a novel learning framework that enhances the diversity of the prior learned in zero-shot learning and improves reconstruction quality. It consists of three stages: multi-weighted zero-shot ensemble learning, denoising knowledge transfer, and model-guided reconstruction. In the first stage, ensemble models are trained with a multi-weighted loss function in k-space, yielding results of higher quality and diversity. In the second stage, a deep denoiser distills the knowledge contained in the ensemble models; the denoiser is initialized with weights pre-trained on natural images, combining external knowledge with the information extracted from the under-sampled data. In the third stage, the denoiser is plugged into an iterative reconstruction algorithm to produce the final image. Extensive experiments demonstrate that the proposed framework surpasses existing zero-shot methods and adapts flexibly to different datasets. In multi-coil reconstruction, the proposed zero-shot framework also outperforms state-of-the-art denoising-based methods.
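The record contains no code; as a rough sketch of the third, model-guided stage only, the Python snippet below shows a generic plug-and-play iteration that alternates a k-space data-consistency gradient step with a learned denoiser acting as the prior. The centered-FFT single-coil Cartesian forward model, the step size, the function names, and the box-filter stand-in for the distilled denoiser are illustrative assumptions, not the authors' algorithm.

import numpy as np

def fft2c(x):
    # Centered 2-D FFT: assumed single-coil Cartesian forward operator.
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(x), norm="ortho"))

def ifft2c(k):
    # Centered 2-D inverse FFT (adjoint of fft2c).
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(k), norm="ortho"))

def pnp_reconstruct(y, mask, denoiser, n_iters=50, step=1.0):
    # Generic plug-and-play proximal-gradient loop.
    #   y        : under-sampled k-space measurements (complex 2-D array)
    #   mask     : binary sampling mask with the same shape as y
    #   denoiser : callable playing the role of the learned (stage-2) prior
    x = ifft2c(mask * y)                      # zero-filled initialization
    for _ in range(n_iters):
        # Data-consistency gradient step in k-space: grad = A^H (A x - y).
        residual = mask * (fft2c(x) - y)
        x = x - step * ifft2c(residual)
        # Prior step: the denoiser stands in for a proximal operator.
        x = denoiser(x)
    return x

if __name__ == "__main__":
    # Toy example with a box-blur "denoiser" replacing the learned network.
    rng = np.random.default_rng(0)
    gt = rng.standard_normal((64, 64))
    mask = (rng.random((64, 64)) < 0.4).astype(float)   # ~40% random sampling
    y = mask * fft2c(gt)

    def box_denoiser(x):
        # Placeholder prior: average each pixel with its four neighbors.
        return (x + np.roll(x, 1, 0) + np.roll(x, -1, 0)
                  + np.roll(x, 1, 1) + np.roll(x, -1, 1)) / 5.0

    recon = pnp_reconstruct(y, mask, box_denoiser, n_iters=30)
    print("relative error:", np.linalg.norm(recon.real - gt) / np.linalg.norm(gt))

In the paper's setting, box_denoiser would be replaced by the denoiser distilled from the zero-shot ensemble in stage two; the loop structure itself is standard plug-and-play reconstruction.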
Pages: 52 - 64
Number of pages: 13