COMPARISON OF NMF WITH KULLBACK-LEIBLER DIVERGENCE AND ITAKURA-SAITO DIVERGENCE FOR ODOR APPROXIMATION

Cited: 0
Authors
Prasetyawan, Dani [1 ]
Nakamoto, Takamichi [1 ]
Affiliations
[1] Tokyo Inst Technol, Tokyo, Japan
Keywords
Odor approximation; odor component; odor reproduction; NMF; essential oils
DOI
Not available
CLC Classification
O65 [Analytical Chemistry]
Discipline Codes
070302; 081704
Abstract
Odor approximation enables us to produce a variety of odors by blending odor components. Approximating an odor based on a sensed odor is called odor reproduction, and the reproduced odor should be as close to the target odor as possible. However, no primary odors have been found, so it is indispensable to find an appropriate set of odor components, with the minimum number, that covers a wide range of odors. Non-negative matrix factorization with the Kullback-Leibler divergence (NMF-KL) and with the Itakura-Saito divergence (NMF-IS) were used to explore odor components in the mass spectrum data space of essential oils that we gathered using mass spectrometry (MS). The two NMF variants differ in how they treat small peaks, and this difference was evaluated for exploring odor components. The results show that NMF-IS has a higher capability of odor reproduction than NMF-KL, especially in reproducing the small peaks that appear in the high m/z region.
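Both NMF-KL and NMF-IS are special cases of NMF under the beta-divergence (beta = 1 for KL, beta = 0 for IS), and both admit standard multiplicative updates. The sketch below is a generic illustration of those updates, not the authors' implementation; the function name `nmf_beta`, the random initialization, and the iteration count are all assumptions for illustration.

```python
import numpy as np

def nmf_beta(V, rank, beta, n_iter=500, seed=0):
    """Multiplicative-update NMF minimizing the beta-divergence D_beta(V | WH).

    beta = 1 recovers the Kullback-Leibler case (NMF-KL);
    beta = 0 recovers the Itakura-Saito case (NMF-IS).
    Illustrative sketch only; not the paper's exact procedure.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Strictly positive random initialization to avoid division by zero.
    W = rng.random((m, rank)) + 1e-6
    H = rng.random((rank, n)) + 1e-6
    eps = 1e-12  # numerical floor

    for _ in range(n_iter):
        # Update H with W fixed, then W with H fixed (standard MU rules).
        WH = W @ H + eps
        H *= (W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1) + eps)
        WH = W @ H + eps
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T + eps)
    return W, H
```

Because the IS divergence is scale-invariant (D_IS(c·v | c·w) = D_IS(v | w)), small and large peaks contribute comparably to the cost, which is consistent with the paper's observation that NMF-IS reproduces small high-m/z peaks better than NMF-KL.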
Pages: 310-312 (3 pages)
Related Papers
50 records in total
  • [2] COMPLEX NMF WITH THE GENERALIZED KULLBACK-LEIBLER DIVERGENCE
    Kameoka, Hirokazu
    Kagami, Hideaki
    Yukawa, Masahiro
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 56 - 60
  • [3] NMF Algorithm Based on Extended Kullback-Leibler Divergence
    Gao, Liuyang
    Tian, Yinghua
    Lv, Pinpin
    Dong, Peng
    PROCEEDINGS OF 2019 IEEE 3RD INFORMATION TECHNOLOGY, NETWORKING, ELECTRONIC AND AUTOMATION CONTROL CONFERENCE (ITNEC 2019), 2019, : 1804 - 1808
  • [4] Renyi Divergence and Kullback-Leibler Divergence
    van Erven, Tim
    Harremoes, Peter
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2014, 60 (07) : 3797 - 3820
  • [5] Robust Hypothesis Testing with the Itakura-Saito Divergence
    Zhou, Feng
    Song, Enbin
    Zhu, Yunmin
    2017 20TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION), 2017, : 1561 - 1566
  • [6] The fractional Kullback-Leibler divergence
    Alexopoulos, A.
    JOURNAL OF PHYSICS A-MATHEMATICAL AND THEORETICAL, 2021, 54 (07)
  • [7] BOUNDS FOR KULLBACK-LEIBLER DIVERGENCE
    Popescu, Pantelimon G.
    Dragomir, Sever S.
    Slusanschi, Emil I.
    Stanasila, Octavian N.
    ELECTRONIC JOURNAL OF DIFFERENTIAL EQUATIONS, 2016,
  • [8] Kullback-Leibler divergence and the Pareto-Exponential approximation
    Weinberg, G. V.
    SPRINGERPLUS, 2016, 5
  • [9] On the Interventional Kullback-Leibler Divergence
    Wildberger, Jonas
    Guo, Siyuan
    Bhattacharyya, Arnab
    Schoelkopf, Bernhard
    CONFERENCE ON CAUSAL LEARNING AND REASONING, VOL 213, 2023, 213 : 328 - 349
  • [10] Kullback-Leibler Divergence Revisited
    Raiber, Fiana
    Kurland, Oren
    ICTIR'17: PROCEEDINGS OF THE 2017 ACM SIGIR INTERNATIONAL CONFERENCE THEORY OF INFORMATION RETRIEVAL, 2017, : 117 - 124