Non-Negative Bregman Divergence Minimization for Deep Direct Density Ratio Estimation

Cited by: 0
Authors
Kato, Masahiro [1]
Teshima, Takeshi [2]
Affiliations
[1] CyberAgent Inc, Tokyo, Japan
[2] Univ Tokyo, Tokyo, Japan
Keywords
COVARIATE SHIFT; INFERENCE; MIXTURE
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Density ratio estimation (DRE) is at the core of various machine learning tasks such as anomaly detection and domain adaptation. Existing work on DRE has extensively studied methods based on Bregman divergence (BD) minimization. However, when BD minimization is applied with highly flexible models such as deep neural networks, it tends to suffer from what we call train-loss hacking, a source of overfitting caused by a typical characteristic of empirical BD estimators. In this paper, to mitigate train-loss hacking, we propose a non-negative correction for empirical BD estimators. Theoretically, we confirm the soundness of the proposed method through a generalization error bound. In our experiments, the proposed methods show favorable performance in inlier-based outlier detection.
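To make the idea concrete, the following is a minimal PyTorch sketch of the least-squares (LSIF) instance of empirical Bregman divergence minimization, together with an illustrative clamp-at-zero correction in the spirit of the non-negative correction described in the abstract; it is not the authors' implementation. The network RatioNet, the assumed upper bound C = 10.0 on the true density ratio, and the toy Gaussian samples are assumptions introduced here for illustration only.

import torch
import torch.nn as nn

class RatioNet(nn.Module):
    """Small MLP that outputs a non-negative density-ratio estimate r(x)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Softplus(),  # keep r(x) >= 0
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def lsif_bd_loss(r_de, r_nu):
    # Plain empirical BD objective (least-squares / LSIF instance):
    #   (1/2) E_de[r^2] - E_nu[r].
    # A very flexible model can push the second term toward minus infinity,
    # which is the "train-loss hacking" failure mode described above.
    return 0.5 * (r_de ** 2).mean() - r_nu.mean()

def nn_lsif_bd_loss(r_de, r_nu, C=10.0):
    # Illustrative non-negative correction (assumes the true ratio is <= C):
    # C * E_de[r] - E_nu[r] is non-negative in expectation, so its empirical
    # counterpart is clamped at zero before being added back to the objective.
    base = (0.5 * r_de ** 2 - C * r_de).mean()
    correction = torch.clamp(C * r_de.mean() - r_nu.mean(), min=0.0)
    return base + correction

if __name__ == "__main__":
    torch.manual_seed(0)
    x_de = torch.randn(512, 5)        # toy sample from the denominator density
    x_nu = torch.randn(512, 5) + 0.5  # toy sample from the numerator density
    model = RatioNet(dim=5)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(200):
        opt.zero_grad()
        loss = nn_lsif_bd_loss(model(x_de), model(x_nu), C=10.0)
        loss.backward()
        opt.step()

In this sketch, lsif_bd_loss is the uncorrected empirical estimator shown only for contrast; the training loop minimizes the corrected nn_lsif_bd_loss.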
Pages: 14