Fast Dynamic Sampling for Determinantal Point Processes

Cited by: 0
Authors
Song, Zhao [1 ]
Yin, Junze [2 ]
Zhang, Lichen [3 ]
Zhang, Ruizhe [4 ]
Affiliations
[1] Adobe Res, Double Adobe, AZ 85617 USA
[2] Boston Univ, Boston, MA 02215 USA
[3] MIT, Cambridge, MA 02139 USA
[4] Univ Calif Berkeley, Berkeley, CA 94720 USA
Source
INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this work, we provide fast dynamic algorithms for repeatedly sampling from distributions characterized by Determinantal Point Processes (DPPs) and Nonsymmetric Determinantal Point Processes (NDPPs). DPPs are a well-studied class of distributions on subsets of items drawn from a ground set of cardinality n, characterized by a symmetric n × n kernel matrix L such that the probability of any subset is proportional to the determinant of its corresponding principal submatrix. Recent work has shown that the kernel symmetry constraint can be relaxed, leading to NDPPs, which can better model data in several machine learning applications. Given a low-rank kernel matrix L = L_+ L_-^⊤ ∈ R^{n×n} and its corresponding eigendecomposition {λ_i, u_i}_{i=1}^d, where d ≤ n is the rank, we design a data structure that uses O(nd) space and preprocesses the data in O(n d^{ω−1}) time, where ω ≈ 2.37 is the exponent of matrix multiplication. The data structure can generate a sample from the DPP distribution in time O(|E|^3 log n + |E|^{ω−1} d^2), or from the NDPP distribution in time O((|E|^3 log n + |E|^{ω−1} d^2)(1 + w)^d), where E is the set of sampled indices and w is a data-dependent parameter. This improves upon the space and preprocessing time of prior works, and achieves state-of-the-art sampling time when the sampled set is relatively dense. At the heart of our data structure is an efficient sampling tree that leverages batch initialization and fast inner-product queries simultaneously.
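To make the abstract's setup concrete, the sketch below illustrates the two ingredients it references: the L-ensemble definition (the probability of a subset S is det(L_S)/det(L + I), proportional to the principal-submatrix determinant) and sampling from the eigendecomposition {λ_i, u_i}. This is the classical spectral DPP sampler (the elementary-DPP mixture construction), not the paper's dynamic sampling-tree data structure; function names and the example kernel are illustrative.

```python
import numpy as np

def subset_prob(L, S):
    """Exact L-ensemble probability: P(S) = det(L_S) / det(L + I)."""
    idx = np.asarray(S, dtype=int)
    L_S = L[np.ix_(idx, idx)]            # principal submatrix indexed by S
    return np.linalg.det(L_S) / np.linalg.det(L + np.eye(len(L)))

def sample_dpp(eigvals, eigvecs, rng=None):
    """Classical spectral DPP sampler, given the eigendecomposition
    of a symmetric PSD kernel L (columns of eigvecs are eigenvectors)."""
    rng = np.random.default_rng(rng)
    # Phase 1: keep eigenvector i independently with prob lambda_i / (1 + lambda_i),
    # selecting an elementary DPP from the mixture.
    keep = rng.random(len(eigvals)) < eigvals / (1.0 + eigvals)
    V = eigvecs[:, keep]
    sample = []
    # Phase 2: draw one item per remaining dimension of span(V).
    while V.shape[1] > 0:
        probs = np.sum(V**2, axis=1)     # P(item j) ∝ squared norm of row j of V
        probs /= probs.sum()
        j = int(rng.choice(len(probs), p=probs))
        sample.append(j)
        # Project out the e_j component: zero row j, drop one column, re-orthonormalize.
        k = int(np.argmax(np.abs(V[j, :])))
        V = V - np.outer(V[:, k] / V[j, k], V[j, :])
        V = np.delete(V, k, axis=1)
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)
    return sorted(sample)
```

Each pass through phase 2 both selects an item and reduces the dimension of span(V) by one, so the sample size equals the number of eigenvectors kept in phase 1; the per-sample cost of this textbook routine is what the paper's data structure improves on for repeated queries.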
Pages: 19