Kernel Stein Discrepancy on Lie Groups: Theory and Applications

Cited by: 0
Authors:
Qu, Xiaoda [1 ]
Fan, Xiran [2 ]
Vemuri, Baba C. [3 ]
Affiliations:
[1] Univ Florida, Dept Stat, Gainesville, FL 32611 USA
[2] Visa, San Francisco, CA 94128 USA
[3] Univ Florida, Dept CISE, Gainesville, FL 32611 USA
Keywords:
Stein's operator; Lie groups; Riemannian manifolds; kernel Stein discrepancy; exponential distribution; Riemannian normal distribution; Fisher distribution; matrix; statistics; rotations
DOI: 10.1109/TIT.2024.3468212
CLC classification: TP (automation and computer technology)
Subject classification code: 0812
Abstract:
Distributional approximation is a fundamental problem in machine learning, with numerous applications across science and engineering. The key challenge in most approximation methods is the intractable normalization constant of the candidate distributions used to model the data. This intractability is especially common in distributions of manifold-valued random variables, such as rotation matrices and orthogonal matrices. In this paper, we focus on the distributional approximation problem on Lie groups, which arise frequently in applications including computer vision, robotics, and medical imaging. We present a novel Stein operator on Lie groups that leads to a kernel Stein discrepancy (KSD), a normalization-free loss function. We establish several theoretical results characterizing this KSD on Lie groups and its minimizer, the minimum KSD estimator (MKSDE). Properties of the MKSDE are stated and proved, including strong consistency, a central limit theorem, and a closed form of the MKSDE for the von Mises-Fisher distribution and, more generally, for the exponential family on SO(N). Finally, we present experimental results demonstrating the advantages of MKSDE over maximum likelihood estimation.
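To illustrate why a KSD can serve as a normalization-free loss, the following minimal Python sketch shows the standard Euclidean kernel Stein discrepancy with an RBF kernel and a toy minimum-KSD fit of a Gaussian mean. It is not the Lie-group Stein operator developed in the paper; the bandwidth h, the score function, and the grid search are illustrative assumptions only.

# Minimal illustrative sketch (not the paper's Lie-group construction): the
# standard Euclidean kernel Stein discrepancy with an RBF kernel, used as a
# normalization-free loss. The model enters only through its score function
# grad log p(x), which does not involve the normalization constant.
import numpy as np

def ksd_squared(samples, score_fn, h=1.0):
    # V-statistic estimate of KSD^2 between the empirical distribution of
    # `samples` and the model whose score function is `score_fn`.
    x = np.asarray(samples)                  # shape (n, d)
    n, d = x.shape
    s = np.apply_along_axis(score_fn, 1, x)  # scores at each sample, (n, d)
    diff = x[:, None, :] - x[None, :, :]     # pairwise differences, (n, n, d)
    sqdist = np.sum(diff ** 2, axis=-1)      # squared distances, (n, n)
    k = np.exp(-sqdist / (2.0 * h ** 2))     # RBF kernel matrix

    # Stein kernel u_p(x_i, x_j) assembled from its four standard terms.
    term1 = (s @ s.T) * k                                      # s(x)^T s(y) k
    term2 = np.einsum('id,ijd->ij', s, diff) / h ** 2 * k      # s(x)^T grad_y k
    term3 = -np.einsum('jd,ijd->ij', s, diff) / h ** 2 * k     # s(y)^T grad_x k
    term4 = (d / h ** 2 - sqdist / h ** 4) * k                 # trace term
    return np.mean(term1 + term2 + term3 + term4)

# Toy minimum-KSD estimation of a Gaussian mean (hypothetical example):
# for N(m, 1), the score is grad log p(x) = -(x - m).
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=(500, 1))
grid = np.linspace(0.0, 4.0, 81)
losses = [ksd_squared(data, lambda x, m=m: -(x - m)) for m in grid]
print("minimum-KSD estimate of the mean:", grid[int(np.argmin(losses))])

The same principle, minimizing a Stein-operator-based discrepancy that never touches the normalization constant, underlies the MKSDE on SO(N) described in the abstract.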
Pages: 8961-8974 (14 pages)