Accelerated MR parameter mapping with low-rank and sparsity constraints

Cited by: 150
Authors
Zhao, Bo [1 ,2 ]
Lu, Wenmiao [2 ]
Hitchens, T. Kevin [3 ,4 ]
Lam, Fan [1 ,2 ]
Ho, Chien [3 ,4 ]
Liang, Zhi-Pei [1 ,2 ]
Affiliations
[1] Univ Illinois, Dept Elect & Comp Engn, Urbana, IL USA
[2] Univ Illinois, Beckman Inst Adv Sci & Technol, Urbana, IL USA
[3] Carnegie Mellon Univ, Pittsburgh NMR Ctr Biomed Res, Pittsburgh, PA 15213 USA
[4] Carnegie Mellon Univ, Dept Biol Sci, Pittsburgh, PA 15213 USA
Keywords
constrained reconstruction; low-rank constraint; joint sparsity constraint; parameter mapping; T-1 mapping; T-2 mapping; RECONSTRUCTION; REGULARIZATION
DOI
10.1002/mrm.25421
Chinese Library Classification
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Subject Classification Codes
1002; 100207; 1009
Abstract
Purpose: To enable accurate magnetic resonance (MR) parameter mapping with accelerated data acquisition, utilizing recent advances in constrained imaging with sparse sampling.
Theory and Methods: A new constrained reconstruction method based on low-rank and sparsity constraints is proposed to accelerate MR parameter mapping. More specifically, the proposed method simultaneously imposes low-rank and joint sparse structures on contrast-weighted image sequences within a unified mathematical formulation. With a pre-estimated subspace, this formulation results in a convex optimization problem, which is solved using an efficient numerical algorithm based on the alternating direction method of multipliers.
Results: To evaluate the performance of the proposed method, two application examples were considered: (i) T-2 mapping of the human brain and (ii) T-1 mapping of the rat brain. For each application, the proposed method was evaluated at both moderate and high acceleration levels. Additionally, the proposed method was compared with two state-of-the-art methods that only use a single low-rank or joint sparsity constraint. The results demonstrate that the proposed method can achieve accurate parameter estimation with both moderately and highly undersampled data. Although all methods performed fairly well with moderately undersampled data, the proposed method achieved much better performance (e.g., more accurate parameter values) than the other two methods with highly undersampled data.
Conclusions: Simultaneously imposing low-rank and sparsity constraints can effectively improve the accuracy of fast MR parameter mapping with sparse sampling. Magn Reson Med 74:489-498, 2015. (c) 2014 Wiley Periodicals, Inc.
Pages: 489-498
Page count: 10
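The abstract describes a reconstruction that represents the contrast-weighted image sequence through a pre-estimated temporal subspace (the low-rank constraint) while additionally enforcing joint sparsity across the contrast dimension, yielding a convex problem in the remaining unknowns. The following is a minimal toy sketch of that idea, not the authors' implementation: it assumes a 1-D mono-exponential T2 phantom, a subspace estimated by SVD of a dictionary of decay curves, joint sparsity imposed directly on the rows of the spatial coefficient matrix (a simplification of the paper's transform-domain penalty), and a plain proximal-gradient (ISTA) solver in place of the ADMM algorithm used in the paper; all names, sizes, and regularization values are ad hoc.

```python
# Toy sketch (not the authors' code): reconstruct a contrast-weighted image
# series C (voxels x contrasts) from undersampled Fourier data using
#   (i) a low-rank model C = U @ V with a pre-estimated temporal subspace V, and
#  (ii) a joint-sparsity (L2,1) penalty on the rows of U,
# solved here with proximal gradient (ISTA) instead of the paper's ADMM.
import numpy as np

rng = np.random.default_rng(0)

# ---- synthetic data: N voxels, T echo times, mono-exponential T2 decay
N, T, r = 128, 16, 3
TE = np.linspace(10e-3, 160e-3, T)                    # echo times (s)
T2 = rng.uniform(40e-3, 120e-3, N)                    # per-voxel T2 (s)
rho = np.zeros(N)
rho[rng.choice(N, 20, replace=False)] = 1.0           # sparse spin-density support
C_true = rho[:, None] * np.exp(-TE[None, :] / T2[:, None])   # N x T series

# ---- pre-estimate the temporal subspace V from a dictionary of decay curves
T2_grid = np.linspace(20e-3, 200e-3, 200)
D = np.exp(-TE[None, :] / T2_grid[:, None])           # dictionary of signal evolutions
V = np.linalg.svd(D, full_matrices=False)[2][:r, :]   # r x T orthonormal basis

# ---- undersampled Fourier measurements with a different mask per contrast
masks = rng.random((N, T)) < 0.3                      # keep ~30% of k-space
y = masks * np.fft.fft(C_true, axis=0, norm="ortho")

def A(U):        # forward model: subspace expansion -> unitary FFT -> sampling
    return masks * np.fft.fft(U @ V, axis=0, norm="ortho")

def At(res):     # adjoint of A
    return np.fft.ifft(masks * res, axis=0, norm="ortho") @ V.conj().T

def prox_l21(U, tau):   # row-wise soft threshold = joint sparsity across contrasts
    nrm = np.linalg.norm(U, axis=1, keepdims=True)
    return U * np.maximum(1.0 - tau / np.maximum(nrm, 1e-12), 0.0)

lam, step = 0.02, 1.0   # ad-hoc regularization; step <= 1 is valid since ||A|| <= 1
U = np.zeros((N, r), dtype=complex)
for _ in range(200):    # ISTA iterations on the convex subproblem in U
    U = prox_l21(U - step * At(A(U) - y), step * lam)

C_rec = (U @ V).real
print("relative error:", np.linalg.norm(C_rec - C_true) / np.linalg.norm(C_true))
```

Because the temporal subspace V is fixed in advance, the unknown reduces to the spatial coefficient matrix U and the objective stays convex, which is what makes either a simple proximal scheme or the paper's ADMM applicable; the printed relative error is only a sanity check on this synthetic example.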
Related Papers (50 in total; items [31]-[40] shown)
• [31] Xue, Jize; Zhao, Yongqiang; Liao, Wenzhi; Chan, Jonathan Cheung-Wai; Kong, Seong G. Enhanced Sparsity Prior Model for Low-Rank Tensor Completion. IEEE Transactions on Neural Networks and Learning Systems, 2020, 31(11): 4567-4581.
• [32] da Cruz, Gastao Lima; Bustin, Aurelien; Jaubert, Oliver; Schneider, Torben; Botnar, Rene M.; Prieto, Claudia. Sparsity and locally low rank regularization for MR fingerprinting. Magnetic Resonance in Medicine, 2019, 81(6): 3530-3543.
• [33] Hanada, Hiroyuki; Hashimoto, Noriaki; Taji, Kouichi; Takeuchi, Ichiro. Generalized Low-Rank Update: Model Parameter Bounds for Low-Rank Training Data Modifications. Neural Computation, 2023, 35(12): 1970-2005.
• [34] Clarkson, Kenneth L.; Woodruff, David P. Low-Rank PSD Approximation in Input-Sparsity Time. Proceedings of the Twenty-Eighth Annual ACM-SIAM Symposium on Discrete Algorithms, 2017: 2061-2072.
• [35] Musco, Cameron; Woodruff, David P. Is Input Sparsity Time Possible for Kernel Low-Rank Approximation? Advances in Neural Information Processing Systems 30 (NIPS 2017), 2017.
• [36] Zhan, Shanhua; Wu, Jigang; Han, Na; Wen, Jie; Fang, Xiaozhao. Unsupervised feature extraction by low-rank and sparsity preserving embedding. Neural Networks, 2019, 109: 56-66.
• [37] Zha, Zhiyuan; Wen, Bihan; Yuan, Xin; Zhou, Jiantao; Zhu, Ce. Reconciliation of Group Sparsity and Low-Rank Models for Image Restoration. 2020 IEEE International Conference on Multimedia and Expo (ICME), 2020.
• [38] Sun, Jun; Shang, Pan; Xu, Qiuyun; Chen, Bingzhen. Multivariate Linear Regression with Low-Rank and Row-Sparsity. Pacific Journal of Optimization, 2022, 18(2): 349-366.
• [39] Feng, Fangchen; Kowalski, Matthieu. Sparsity and Low-Rank Amplitude Based Blind Source Separation. 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017: 571-575.
• [40] Wen, Weijing; Yang, Fan; Su, Yangfeng; Zhou, Dian; Zeng, Xuan. Learning Low-Rank Structured Sparsity in Recurrent Neural Networks. 2020 IEEE International Symposium on Circuits and Systems (ISCAS), 2020.