Predictive Learning on Hidden Tree-Structured Ising Models

Cited: 0
|
Authors
Nikolakakis, Konstantinos E. [1 ]
Kalogerias, Dionysios S. [2 ]
Sarwate, Anand D. [1 ]
Affiliations
[1] Rutgers State Univ, Dept Elect & Comp Engn, 94 Brett Rd, Piscataway, NJ 08854 USA
[2] Michigan State Univ, Dept Elect & Comp Engn, 428 S Shaw Lane, E Lansing, MI 48824 USA
Keywords
Ising Model; Chow-Liu Algorithm; Structure Learning; Predictive Learning; Distribution Estimation; Noisy Data; Hidden Markov Random Fields; MARKOV RANDOM-FIELD; LIKELIHOOD-ESTIMATION; PARTICLE FILTERS; GRAPHICAL MODEL; SELECTION; DISTRIBUTIONS; CONSISTENCY; TUTORIAL;
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
We provide high-probability sample complexity guarantees for exact structure recovery and accurate predictive learning using noise-corrupted samples from an acyclic (tree-shaped) graphical model. The hidden variables follow a tree-structured Ising model distribution, whereas the observable variables are generated by a binary symmetric channel taking the hidden variables as its input (flipping each bit independently with some constant probability q ∈ [0, 1/2)). In the absence of noise, predictive learning on Ising models was recently studied by Bresler and Karzand (2020); this paper quantifies how noise in the hidden model impacts the tasks of structure recovery and marginal distribution estimation by proving upper and lower bounds on the sample complexity. Our results generalize state-of-the-art bounds reported in prior work, and they exactly recover the noiseless case (q = 0). In fact, for any tree with p vertices and probability of incorrect recovery δ > 0, the sufficient number of samples remains logarithmic as in the noiseless case, i.e., O(log(p/δ)), while the dependence on q is O(1/(1 - 2q)^4) for both aforementioned tasks. We also present a new equivalent of Isserlis' Theorem for sign-valued tree-structured distributions, yielding a new low-complexity algorithm for higher-order moment estimation.
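The pipeline the abstract describes can be illustrated end to end. Below is a minimal Python sketch, not taken from the paper: the function names, the path-shaped example tree, and the single shared edge strength theta are all illustrative assumptions. It draws samples from a zero-field tree-structured Ising model, corrupts them with a binary symmetric channel of crossover probability q, and runs the Chow-Liu step on the noisy samples. For sign-valued zero-field models, mutual information is increasing in |E[X_i X_j]|, so Chow-Liu reduces to a maximum-weight spanning tree over the absolute empirical correlations.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_tree_ising(parent, theta, n, rng):
    """Draw n samples from a zero-field Ising model on a rooted tree.

    parent[v] is the parent of vertex v (parent[0] == -1 for the root,
    and parent[v] < v is assumed); each edge has strength theta, so a
    child copies its parent's sign with probability (1 + tanh(theta))/2.
    """
    p = len(parent)
    copy_prob = 0.5 * (1.0 + np.tanh(theta))
    x = np.empty((n, p), dtype=np.int8)
    x[:, 0] = rng.choice([-1, 1], size=n)        # uniform root spin
    for v in range(1, p):
        agree = rng.random(n) < copy_prob
        x[:, v] = np.where(agree, x[:, parent[v]], -x[:, parent[v]])
    return x

def bsc(x, q, rng):
    """Binary symmetric channel: flip each spin independently w.p. q."""
    flips = rng.random(x.shape) < q
    return np.where(flips, -x, x).astype(np.int8)

def chow_liu_tree(samples):
    """Max-weight spanning tree on |empirical correlations| (Kruskal)."""
    n, p = samples.shape
    corr = samples.T.astype(float) @ samples.astype(float) / n
    edges = sorted(((abs(corr[i, j]), i, j)
                    for i in range(p) for j in range(i + 1, p)),
                   reverse=True)
    root = list(range(p))                        # union-find forest
    def find(a):
        while root[a] != a:
            root[a] = root[root[a]]
            a = root[a]
        return a
    tree = set()
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                             # keep edge if no cycle
            root[ri] = rj
            tree.add((i, j))
    return tree

# Hidden tree: a path 0-1-2-...-(p-1); theta sets the edge correlations.
p, n, theta, q = 10, 5000, 0.8, 0.1
parent = [-1] + list(range(p - 1))
hidden = sample_tree_ising(parent, theta, n, rng)
observed = bsc(hidden, q, rng)                   # learner sees only these

true_edges = {(v - 1, v) for v in range(1, p)}
print("recovered true tree:", chow_liu_tree(observed) == true_edges)
```

Consistent with the abstract's O(1/(1 - 2q)^4) dependence, the noise attenuates every pairwise correlation by the factor (1 - 2q)^2, so shrinking the gap between tree and non-tree edge weights is what drives the extra sample cost as q approaches 1/2.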
Pages: 82