Connectome Smoothing via Low-Rank Approximations

Cited by: 20
Authors
Tang, Runze [1 ]
Ketcha, Michael [2 ]
Badea, Alexandra [3 ,4 ]
Calabrese, Evan D. [3 ,4 ]
Margulies, Daniel S. [5 ]
Vogelstein, Joshua T. [2 ]
Priebe, Carey E. [1 ]
Sussman, Daniel L. [6 ]
Affiliations
[1] Johns Hopkins Univ, Dept Appl Math & Stat, Baltimore, MD 21218 USA
[2] Johns Hopkins Univ, Dept Biomed Engn, Baltimore, MD 21218 USA
[3] Duke Univ, Dept Radiol, Durham, NC 27708 USA
[4] Duke Univ, Dept Biomed Engn, Durham, NC 27708 USA
[5] Max Planck Inst Human Cognit & Brain Sci, Max Planck Res Grp Neuroanat & Connect, D-04103 Leipzig, Germany
[6] Boston Univ, Dept Math & Stat, Boston, MA 02215 USA
Keywords
Networks; connectome; low-rank; estimation; CEREBRAL-CORTEX; MATRIX; DIMENSIONALITY; NETWORK; GRAPHS; BRAIN
DOI
10.1109/TMI.2018.2885968
Chinese Library Classification (CLC)
TP39 [Computer applications]
Subject Classification Codes
081203; 0835
Abstract
In brain imaging and connectomics, the study of brain networks, estimating the mean of a population of graphs from a sample is a core problem. This problem is often especially difficult because the sample or cohort size is relatively small, sometimes only a single subject, while the number of nodes can be very large and the estimates of connectivity are noisy. The element-wise sample mean of the adjacency matrices is a common approach, but it does not exploit the underlying structural properties of the graphs. We propose a low-rank method that incorporates dimension selection and diagonal augmentation to smooth the estimates and improve performance over the naive methodology for small sample sizes. Theoretical results for the stochastic block model show that this method offers major improvements when there are many vertices. Similarly, we demonstrate that the low-rank methods outperform the standard sample mean for a variety of independent-edge distributions as well as for human connectome data derived from magnetic resonance imaging, especially when the sample sizes are small. Moreover, the low-rank methods yield "eigen-connectomes," which correlate with the lobe structure of the human brain and with superstructures of the mouse brain. These results indicate that low-rank methods are an important part of the toolbox for researchers studying populations of graphs in general and statistical connectomics in particular.
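
The abstract describes the smoothing procedure only at a high level. As a rough illustration, the following is a minimal Python/NumPy sketch of that kind of low-rank estimator: take the element-wise sample mean of the adjacency matrices, impute the otherwise-zero diagonal with each vertex's average off-diagonal weight (diagonal augmentation), and keep only the top-d singular components. The function name low_rank_mean_estimate, the simulated two-block stochastic block model, and the hand-picked rank are illustrative assumptions, not the authors' implementation.

import numpy as np

def low_rank_mean_estimate(adjacency_stack, rank):
    """Smooth the element-wise sample mean of a stack of adjacency matrices
    by projecting it onto its top-`rank` singular subspace (illustrative sketch)."""
    abar = adjacency_stack.mean(axis=0)          # element-wise sample mean
    n = abar.shape[0]

    # Diagonal augmentation: hollow adjacency matrices have zero diagonals,
    # so impute each diagonal entry with the vertex's mean off-diagonal weight.
    aug = abar.copy()
    np.fill_diagonal(aug, abar.sum(axis=1) / (n - 1))

    # Truncated decomposition of the augmented mean; keep the top `rank` components.
    u, s, vt = np.linalg.svd(aug, hermitian=True)
    phat = u[:, :rank] @ np.diag(s[:rank]) @ vt[:rank, :]

    # For binary graphs the entries estimate edge probabilities, so clip to [0, 1].
    return np.clip(phat, 0.0, 1.0)

# Toy usage: five noisy graphs from a two-block stochastic block model.
rng = np.random.default_rng(0)
n, m = 100, 5
block_probs = np.kron(np.array([[0.5, 0.1], [0.1, 0.4]]), np.ones((n // 2, n // 2)))
upper = np.triu((rng.random((m, n, n)) < block_probs).astype(float), 1)
graphs = upper + upper.transpose(0, 2, 1)        # symmetric, hollow adjacency matrices
estimate = low_rank_mean_estimate(graphs, rank=2)

The fixed rank=2 is a simplification for this sketch; the abstract's "dimension selection" refers to choosing that rank from the data, a step not implemented here.
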
Pages: 1446-1456
Number of pages: 11
Related Papers
50 records in total (items [41]-[50] shown)
  • [41] OPTIMAL LOW-RANK APPROXIMATIONS OF BAYESIAN LINEAR INVERSE PROBLEMS
    Spantini, Alessio
    Solonen, Antti
    Cui, Tiangang
    Martin, James
    Tenorio, Luis
    Marzouk, Youssef
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2015, 37 (06): A2451 - A2487
  • [42] A NEW PRECONDITIONER THAT EXPLOITS LOW-RANK APPROXIMATIONS TO FACTORIZATION ERROR
    Higham, Nicholas J.
    Mary, Theo
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2019, 41 (01): A59 - A82
  • [43] FREQUENCY-LIMITED BALANCED TRUNCATION WITH LOW-RANK APPROXIMATIONS
    Benner, Peter
    Kuerschner, Patrick
    Saak, Jens
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2016, 38 (01): A471 - A499
  • [44] KOLMOGOROV WIDTHS AND LOW-RANK APPROXIMATIONS OF PARAMETRIC ELLIPTIC PDES
    Bachmayr, Markus
    Cohen, Albert
    MATHEMATICS OF COMPUTATION, 2017, 86 (304) : 701 - 724
  • [45] Image Completion with Filtered Low-Rank Tensor Train Approximations
    Zdunek, Rafal
    Fonal, Krzysztof
    Sadowski, Tomasz
    ADVANCES IN COMPUTATIONAL INTELLIGENCE, IWANN 2019, PT II, 2019, 11507 : 235 - 245
  • [46] BEST LOW-RANK APPROXIMATIONS AND KOLMOGOROV n-WIDTHS
    Floater, Michael S.
    Manni, Carla
    Sande, Espen
    Speleers, Hendrik
    SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2021, 42 (01) : 330 - 350
  • [47] ALRA: Adaptive Low-Rank Approximations for Neural Network Pruning
    Sinha, Soumen
    Sinha, Rajen Kumar
    2024 IEEE 48TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE, COMPSAC 2024, 2024, : 1636 - 1641
  • [48] How Good are Low-Rank Approximations in Gaussian Process Regression?
    Daskalakis, Constantinos
    Dellaportas, Petros
    Panos, Aristeidis
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 6463 - 6470
  • [49] Singular Values of Dual Quaternion Matrices and Their Low-Rank Approximations
    Ling, Chen
    He, Hongjin
    Qi, Liqun
    NUMERICAL FUNCTIONAL ANALYSIS AND OPTIMIZATION, 2022, 43 (12) : 1423 - 1458
  • [50] Pass-efficient truncated UTV for low-rank approximations
    Ji, Ying
    Feng, Yuehua
    Dong, Yongxin
    COMPUTATIONAL & APPLIED MATHEMATICS, 2024, 43 (01):