Incorporating Prior Knowledge when Learning Mixtures of Truncated Basis Functions from Data

Cited: 1
Authors
Fernandez, Antonio [1 ]
Perez-Bernabe, Inmaculada [1 ]
Rumi, Rafael [1 ]
Salmeron, Antonio [1 ]
Affiliations
[1] Univ Almeria, Dept Math, E-04120 Almeria, Spain
Source
TWELFTH SCANDINAVIAN CONFERENCE ON ARTIFICIAL INTELLIGENCE (SCAI 2013) | 2013, Vol. 257
Keywords
mixtures of truncated basis functions; prior information; learning; hybrid Bayesian networks; inference
DOI
10.3233/978-1-61499-330-8-95
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Mixtures of truncated basis functions (MoTBFs) have recently been proposed as a generalisation of mixtures of truncated exponentials and mixtures of polynomials for modelling univariate and conditional distributions in hybrid Bayesian networks. In this paper we analyse the problem of incorporating prior knowledge when learning univariate MoTBFs. We consider scenarios where the prior knowledge is expressed as an MoTBF that is combined with another MoTBF density estimated from the available data. An important property, from the point of view of inference in hybrid Bayesian networks, is that the density obtained after the combination is again an MoTBF. We evaluate the performance of the proposed method in a series of experiments with simulated data, which suggest that incorporating prior knowledge improves the estimates, especially when data are scarce.
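The key property stated in the abstract, that combining a prior MoTBF with a data-estimated MoTBF yields again an MoTBF, can be sketched with a small example. The sketch below assumes both densities are written over the same truncated basis (e.g. polynomials on a common support) and that the combination is a convex mixture weighted by a prior equivalent sample size; the function name combine_motbf, the weight scheme s/(s+n) and the example coefficients are illustrative assumptions, not the exact procedure from the paper.

```python
import numpy as np

def combine_motbf(prior_coeffs, data_coeffs, n, s):
    """Illustrative sketch (not the paper's exact procedure): combine a
    prior MoTBF with an MoTBF estimated from data.

    Both densities are assumed to be expressed over the same truncated
    basis (e.g. 1, x, x^2, ... on a common support), so a convex
    combination of the coefficient vectors is again an MoTBF density.

    prior_coeffs : basis coefficients of the prior MoTBF
    data_coeffs  : basis coefficients of the MoTBF fitted to the data
    n            : number of observations behind the data estimate
    s            : equivalent sample size assigned to the prior (assumed)
    """
    prior_coeffs = np.asarray(prior_coeffs, dtype=float)
    data_coeffs = np.asarray(data_coeffs, dtype=float)
    lam = s / (s + n)  # prior weight; vanishes as the sample grows
    return lam * prior_coeffs + (1.0 - lam) * data_coeffs

# Example with the polynomial basis {1, x, x^2} on [0, 1]:
# a uniform prior and a hypothetical fitted density 0.5 + 3x - 3x^2.
prior = [1.0, 0.0, 0.0]
fitted = [0.5, 3.0, -3.0]
print(combine_motbf(prior, fitted, n=10, s=5))  # -> [0.6667, 2.0, -2.0]
```

Because both coefficient vectors describe densities on the same support, the convex combination integrates to one and stays within the MoTBF family, which is what keeps inference in the hybrid Bayesian network tractable after the prior is incorporated.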
Pages: 95-104 (10 pages)
Related Papers (50 in total)
  • [21] Learning credible DNNs via incorporating prior knowledge and model local explanation
    Du, Mengnan
    Liu, Ninghao
    Yang, Fan
    Hu, Xia
    KNOWLEDGE AND INFORMATION SYSTEMS, 2021, 63 (02) : 305 - 332
  • [23] VQSVM: A case study for incorporating prior domain knowledge into inductive machine learning
    Yu, Ting
    Simoff, Simeon
    Jan, Tony
    NEUROCOMPUTING, 2010, 73 (13-15) : 2614 - 2623
  • [24] Incorporating Prior Scientific Knowledge Into Deep Learning for Precipitation Nowcasting on Radar Images
    Danpoonkij, Pattarapong
    Kleawsirikul, Nutnaree
    Leepaisomboon, Patamawadee
    Gaviphatt, Natnapat
    Sakaino, Hidetomo
    Vateekul, Peerapon
    2021 18TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER SCIENCE AND SOFTWARE ENGINEERING (JCSSE-2021), 2021,
  • [25] Training for Coherence Formation When Learning From Text and Picture and the Interplay With Learners' Prior Knowledge
    Seufert, Tina
    FRONTIERS IN PSYCHOLOGY, 2019, 10
  • [26] Incorporating Prior Knowledge in Local Differentially Private Data Collection for Frequency Estimation
    Chen, Xue
    Wang, Cheng
    Cui, Jipeng
    Yang, Qing
    Hu, Teng
    Jiang, Changjun
    IEEE TRANSACTIONS ON BIG DATA, 2023, 9 (02) : 499 - 511
  • [27] Incorporating biological prior knowledge for Bayesian learning via maximal knowledge-driven information priors
    Boluki, Shahin
    Esfahani, Mohammad Shahrokh
    Qian, Xiaoning
    Dougherty, Edward R.
    BMC BIOINFORMATICS, 2017, 18
  • [29] A graphical approach for incorporating prior knowledge when determining a sample size for the assessment of batched products
    Woodward, P
    Branson, J
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES D-THE STATISTICIAN, 2001, 50 : 417 - 426
  • [30] A Method for Integrating Expert Knowledge When Learning Bayesian Networks From Data
    Cano, Andres
    Masegosa, Andres R.
    Moral, Serafin
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2011, 41 (05): : 1382 - 1394