Fast Dynamic Sampling for Determinantal Point Processes

Cited: 0
Authors
Song, Zhao [1 ]
Yin, Junze [2 ]
Zhang, Lichen [3 ]
Zhang, Ruizhe [4 ]
Affiliations
[1] Adobe Res, Double Adobe, AZ 85617 USA
[2] Boston Univ, Boston, MA 02215 USA
[3] MIT, Cambridge, MA 02139 USA
[4] Univ Calif Berkeley, Berkeley, CA 94720 USA
Source
INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
In this work, we provide fast dynamic algorithms for repeatedly sampling from distributions characterized by Determinantal Point Processes (DPPs) and Nonsymmetric Determinantal Point Processes (NDPPs). DPPs are a well-studied class of distributions over subsets of a ground set of cardinality n, characterized by a symmetric n × n kernel matrix L such that the probability of any subset is proportional to the determinant of its corresponding principal submatrix of L. Recent work has shown that the symmetry constraint on the kernel can be relaxed, yielding NDPPs, which can better model data in several machine learning applications. Given a low-rank kernel matrix L ∈ R^{n×n} and its corresponding eigendecomposition {(λ_i, u_i)}_{i=1}^d, where d ≤ n is the rank, we design a data structure that uses O(nd) space and preprocesses the data in O(n d^{ω-1}) time, where ω ≈ 2.37 is the exponent of matrix multiplication. The data structure can generate a sample from the DPP distribution in O(|E|^3 log n + |E|^{ω-1} d^2) time, or from the NDPP distribution in O((|E|^3 log n + |E|^{ω-1} d^2) (1 + w)^d) time, where E is the set of sampled indices and w is a data-dependent parameter. This improves upon the space and preprocessing time of prior works, and achieves a state-of-the-art sampling time when the sampled set is relatively dense. At the heart of our data structure is an efficient sampling tree that supports both batch initialization and fast inner-product queries.
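The core definition in the abstract — the probability of a subset S is proportional to det(L_S), the determinant of the principal submatrix of L indexed by S, with normalizer det(L + I) — can be illustrated with a brute-force sampler over a tiny ground set. This is a minimal sketch with illustrative helper names (not the paper's method): it enumerates all 2^n subsets, which is exactly the exponential cost the paper's sampling-tree data structure is designed to avoid.

```python
# Brute-force DPP sampling for tiny n: P(S) = det(L_S) / det(L + I).
import itertools

import numpy as np

def subset_weight(L, S):
    """Unnormalized DPP probability of subset S: det(L_S); det of the empty matrix is 1."""
    if not S:
        return 1.0
    idx = np.ix_(list(S), list(S))
    return float(np.linalg.det(L[idx]))

def brute_force_dpp_sample(L, rng):
    """Sample a subset of {0, ..., n-1} with probability det(L_S) / det(L + I)."""
    n = L.shape[0]
    subsets = [S for r in range(n + 1) for S in itertools.combinations(range(n), r)]
    # Clip tiny negative determinants caused by floating-point error on singular L_S;
    # for a PSD kernel every principal minor is nonnegative in exact arithmetic.
    weights = np.clip([subset_weight(L, S) for S in subsets], 0.0, None)
    # Standard DPP identity: the normalizer sum_S det(L_S) equals det(L + I).
    Z = weights.sum()
    assert np.isclose(Z, np.linalg.det(L + np.eye(n)))
    return subsets[rng.choice(len(subsets), p=weights / Z)]

# A small PSD kernel L = B B^T of rank 2: samples have at most 2 items,
# since det(L_S) = 0 whenever |S| exceeds the rank of L.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 2))
L = B @ B.T
S = brute_force_dpp_sample(L, rng)
print(S)  # a tuple of at most 2 indices
```

The rank-2 kernel also illustrates why the paper parameterizes everything by the rank d ≤ n: the low-rank structure both bounds the sample size and is what the O(nd)-space data structure exploits.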
Pages: 19
Related Papers
(50 total)
  • [21] A Polynomial Time MCMC Method for Sampling from Continuous Determinantal Point Processes
    Rezaei, Alireza
    Gharan, Shayan Oveis
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [22] Determinantal point processes for coresets
    Tremblay, Nicolas
    Barthelmé, Simon
    Amblard, Pierre-Olivier
    Journal of Machine Learning Research, 2019, 20
  • [23] Using The Matrix Ridge Approximation to Speedup Determinantal Point Processes Sampling Algorithms
    Wang, Shusen
    Zhang, Chao
    Qian, Hui
    Zhang, Zhihua
    PROCEEDINGS OF THE TWENTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2014, : 2121 - 2127
  • [24] Determinantal identity for multilevel ensembles and finite determinantal point processes
    J. Harnad
    A. Yu. Orlov
    Analysis and Mathematical Physics, 2012, 2 : 105 - 121
  • [25] Practical Nonisotropic Monte Carlo Sampling in High Dimensions via Determinantal Point Processes
    Choromanski, Krzysztof
    Pacchiano, Aldo
    Parker-Holder, Jack
    Tang, Yunhao
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 1363 - 1373
  • [27] Quantifying repulsiveness of determinantal point processes
    Biscio, Christophe Ange Napoleon
    Lavancier, Frederic
    BERNOULLI, 2016, 22 (04) : 2001 - 2028
  • [28] Difference operators and determinantal point processes
    Olshanski, Grigori
    FUNCTIONAL ANALYSIS AND ITS APPLICATIONS, 2008, 42 (04) : 317 - 329
  • [29] Determinantal point processes in the flat limit
    Barthelme, Simon
    Tremblay, Nicolas
    Usevich, Konstantin
    Amblard, Pierre-Olivier
    BERNOULLI, 2023, 29 (02) : 957 - 983
  • [30] Learning Nonsymmetric Determinantal Point Processes
    Gartrell, Mike
    Brunel, Victor-Emmanuel
    Dohmatob, Elvis
    Krichene, Syrine
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32