Incorporating Prior Knowledge when Learning Mixtures of Truncated Basis Functions from Data

Cited by: 1
Authors
Fernandez, Antonio [1 ]
Perez-Bernabe, Inmaculada [1 ]
Rumi, Rafael [1 ]
Salmeron, Antonio [1 ]
Affiliations
[1] Univ Almeria, Dept Math, E-04120 Almeria, Spain
Keywords
mixtures of truncated basis functions; prior information; learning; hybrid Bayesian networks; inference
DOI
10.3233/978-1-61499-330-8-95
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Mixtures of truncated basis functions (MoTBFs) have recently been proposed as a generalisation of mixtures of truncated exponentials and mixtures of polynomials for modelling univariate and conditional distributions in hybrid Bayesian networks. In this paper, we analyse the problem of incorporating prior knowledge when learning univariate MoTBFs. We consider scenarios where the prior knowledge is expressed as an MoTBF that is combined with another MoTBF density estimated from the available data. An important property, from the point of view of inference in hybrid Bayesian networks, is that the density obtained after the combination is again an MoTBF. We demonstrate the performance of the proposed method in a series of experiments with simulated data; the results suggest that incorporating prior knowledge improves the estimates, especially when data are scarce.
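The abstract does not spell out the combination rule, but the closure property it mentions can be illustrated with a minimal sketch. Assuming the prior and the data-estimated densities are combined as a convex combination (an assumption for illustration, not the paper's stated method), and using mixtures of polynomials (MoPs, a special case of MoTBFs) on a single interval, the combined coefficients are again a polynomial density. The function names below are hypothetical:

```python
from itertools import zip_longest

def combine_mop(prior, estimated, lam):
    """Convex combination lam*prior + (1-lam)*estimated of two MoP
    densities, each given as coefficients [c0, c1, ...] of
    c0 + c1*x + ... on a shared interval. Because the combination is
    linear in the coefficients, the result is again a polynomial,
    i.e. it stays inside the MoP (and hence MoTBF) family."""
    return [lam * a + (1.0 - lam) * b
            for a, b in zip_longest(prior, estimated, fillvalue=0.0)]

def integral_on_unit_interval(coeffs):
    """Analytic integral of the polynomial over [0, 1]."""
    return sum(c / (k + 1) for k, c in enumerate(coeffs))

# Prior belief: uniform density on [0, 1], i.e. f(x) = 1.
prior = [1.0]
# Density estimated from data: f(x) = 2x on [0, 1].
estimated = [0.0, 2.0]

combined = combine_mop(prior, estimated, lam=0.3)
# The combination of two valid densities still integrates to one.
assert abs(integral_on_unit_interval(combined) - 1.0) < 1e-9
```

Closure under such operations is what keeps inference tractable: every factor produced during propagation in the hybrid Bayesian network remains representable in the same MoTBF form.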
Pages: 95-104
Page count: 10
Related Papers
  • [1] Langseth, H., Nielsen, T. D., Perez-Bernabe, I., Salmeron, A.: Learning mixtures of truncated basis functions from data. International Journal of Approximate Reasoning 55(4): 940-956, 2014.
  • [2] Langseth, H., Nielsen, T. D., Rumi, R., Salmeron, A.: Mixtures of truncated basis functions. International Journal of Approximate Reasoning 53(2): 212-227, 2012.
  • [3] Perez-Bernabe, I., Salmeron, A., Langseth, H.: Learning Conditional Distributions Using Mixtures of Truncated Basis Functions. Symbolic and Quantitative Approaches to Reasoning with Uncertainty, ECSQARU 2015, vol. 9161: 397-406, 2015.
  • [4] Lu, B., Wang, X., Utiyama, M.: Incorporating prior knowledge into learning by dividing training data. Frontiers of Computer Science in China 3(1): 109-122, 2009.
  • [5] Su, C., Borsuk, M. E., Andrew, A., Karagas, M.: Incorporating Prior Expert Knowledge in Learning Bayesian Networks from Genetic Epidemiological Data. 2014 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology, 2014.
  • [6] Salmeron, A., Langseth, H., Masegosa, A., Nielsen, T. D.: A Reparameterization of Mixtures of Truncated Basis Functions and its Applications. International Conference on Probabilistic Graphical Models, vol. 186, 2022.
  • [7] Perez-Bernabe, I., Maldonado, A. D., Salmeron, A., Nielsen, T. D.: MoTBFs: An R Package for Learning Hybrid Bayesian Networks Using Mixtures of Truncated Basis Functions. The R Journal 12(2): 343-359, 2020.
  • [8] Wang, Z., Li, L., Zeng, D., Wu, X.: Incorporating prior knowledge from counterfactuals into knowledge graph reasoning. Knowledge-Based Systems 223, 2021.