On varimax asymptotics in network models and spectral methods for dimensionality reduction

Cited: 0
Authors
Cape, J. [1]
Affiliations
[1] Univ Wisconsin, Dept Stat, 1300 Univ Ave, Madison, WI 53706 USA
Funding
US National Science Foundation
Keywords
Data embedding; Factor analysis; Latent variable; Network; Random graph; Varimax rotation; EIGENVECTORS
DOI
10.1093/biomet/asad061
Chinese Library Classification
Q [Biological Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
Varimax factor rotations, while popular among practitioners in psychology and statistics since their introduction, have historically been viewed with skepticism and suspicion by some theoreticians and mathematical statisticians. Recent work provides new, fundamental insight: varimax rotations provably perform statistical estimation in certain classes of latent variable models when paired with spectral-based matrix truncations for dimensionality reduction. We build on this newfound understanding of varimax rotations by developing further connections to network analysis and spectral methods rooted in entrywise matrix perturbation analysis. Concretely, this paper establishes the asymptotic multivariate normality of vectors in varimax-transformed Euclidean point clouds that represent low-dimensional node embeddings in certain latent space random graph models. We address related concepts, including network sparsity, data denoising and the role of matrix rank in latent variable parameterizations. Collectively, these findings, at the confluence of classical and contemporary multivariate analysis, reinforce methodology and inference procedures grounded in matrix factorization-based techniques. Numerical examples illustrate our findings and supplement our discussion.
Pages: 609-623
Page count: 16
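The pipeline summarized in the abstract, a spectral matrix truncation followed by a varimax rotation of the resulting low-dimensional node embedding, can be sketched in code. This is an illustrative reconstruction under our own assumptions, not the paper's procedure: the `varimax` routine below is the classical Kaiser-style SVD iteration, and the two-block stochastic block model, the function names, and all parameter choices are hypothetical.

```python
import numpy as np

def varimax(Phi, gamma=1.0, max_iter=100, tol=1e-8):
    """Classical varimax rotation of a loading matrix via iterative SVD updates."""
    p, k = Phi.shape
    R = np.eye(k)
    d_old = 0.0
    for _ in range(max_iter):
        Lam = Phi @ R
        # Gradient-like target for the varimax criterion (Kaiser normalization omitted)
        B = Phi.T @ (Lam ** 3 - (gamma / p) * Lam @ np.diag((Lam ** 2).sum(axis=0)))
        U, s, Vt = np.linalg.svd(B)
        R = U @ Vt                      # nearest orthogonal matrix to B
        d_new = s.sum()
        if d_new < d_old * (1.0 + tol):
            break
        d_old = d_new
    return Phi @ R, R

rng = np.random.default_rng(0)
n, k = 200, 2
z = rng.integers(0, k, size=n)                      # latent block labels
Bmat = np.array([[0.6, 0.1], [0.1, 0.5]])           # block connection probabilities
P = Bmat[z][:, z]                                   # edge probability matrix
A = rng.binomial(1, np.triu(P, 1))
A = A + A.T                                         # symmetric adjacency, no self-loops

# Spectral truncation: keep the top-k eigenpairs by eigenvalue magnitude
vals, vecs = np.linalg.eigh(A)
idx = np.argsort(np.abs(vals))[::-1][:k]
X = vecs[:, idx] * np.sqrt(np.abs(vals[idx]))       # adjacency spectral embedding

# Varimax-transformed Euclidean point cloud of node embeddings
Y, R = varimax(X)
```

Each row of `Y` is one node's varimax-rotated embedding vector; the paper's asymptotic normality results concern the distribution of such rows in latent space random graph models.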