Incorporating Prior Knowledge when Learning Mixtures of Truncated Basis Functions from Data

Times cited: 1
Authors
Fernandez, Antonio [1]
Perez-Bernabe, Inmaculada [1]
Rumi, Rafael [1]
Salmeron, Antonio [1]
Affiliations
[1] Univ Almeria, Dept Math, E-04120 Almeria, Spain
Source
TWELFTH SCANDINAVIAN CONFERENCE ON ARTIFICIAL INTELLIGENCE (SCAI 2013) | 2013 / Volume 257
Keywords
mixtures of truncated basis functions; prior information; learning; hybrid Bayesian networks; INFERENCE;
DOI
10.3233/978-1-61499-330-8-95
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Mixtures of truncated basis functions (MoTBFs) have recently been proposed as a generalisation of mixtures of truncated exponentials and mixtures of polynomials for modelling univariate and conditional distributions in hybrid Bayesian networks. In this paper we analyse the problem of incorporating prior knowledge when learning univariate MoTBFs. We consider scenarios where the prior knowledge is expressed as an MoTBF that is combined with another MoTBF density estimated from the available data. An important property, from the point of view of inference in hybrid Bayesian networks, is that the density resulting from the combination is again an MoTBF. We demonstrate the performance of the proposed method in a series of experiments with simulated data. The experiments suggest that incorporating prior knowledge improves the estimates, especially when data are scarce.
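The abstract does not state the combination rule, only the closure property (the combined density is again an MoTBF). As a minimal sketch of that property, the Python snippet below assumes a convex "linear pooling" of two mixture-of-polynomials (MoP) densities on a single interval, MoPs being a special case of MoTBFs; a weighted sum of two polynomials is again a polynomial, and a convex combination of two densities on the same interval still integrates to one. The class MoP, the function combine and the example densities are hypothetical names introduced here for illustration, not the authors' implementation.

import numpy as np

class MoP:
    """Univariate mixture-of-polynomials density on a single interval [a, b]."""
    def __init__(self, coeffs, a, b):
        # Coefficients are in increasing degree order.
        self.poly = np.polynomial.Polynomial(coeffs)
        self.a, self.b = a, b

    def pdf(self, x):
        return self.poly(x)

    def integral(self):
        # Integral of the density over [a, b]; should be 1 for a proper density.
        antiderivative = self.poly.integ()
        return antiderivative(self.b) - antiderivative(self.a)

def combine(prior, estimate, weight):
    """Convex combination weight*prior + (1-weight)*estimate.

    Because polynomials are closed under linear combination, the result is
    again an MoP on the same interval, illustrating the closure property
    highlighted in the abstract (assumed combination rule, not the paper's).
    """
    assert (prior.a, prior.b) == (estimate.a, estimate.b), "same support required"
    mixed = weight * prior.poly + (1.0 - weight) * estimate.poly
    return MoP(mixed.coef, prior.a, prior.b)

# Illustrative example: a uniform prior on [0, 1] combined with a
# data-driven estimate f(x) = 2x, using weight 0.5.
prior = MoP([1.0], 0.0, 1.0)           # prior knowledge expressed as an MoP
estimate = MoP([0.0, 2.0], 0.0, 1.0)   # MoP estimated from data (illustrative)
combined = combine(prior, estimate, weight=0.5)
print(combined.pdf(np.array([0.0, 0.5, 1.0])))  # [0.5 1.  1.5]
print(combined.integral())                      # 1.0

With weight 1 the prior is returned unchanged and with weight 0 the data-driven estimate is; one natural choice (an assumption here, not taken from the paper) is to let the weight shrink as the sample size grows, which is consistent with the abstract's observation that prior knowledge helps most when data are scarce.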
Pages: 95-104
Number of pages: 10
Related papers
50 records in total
  • [31] miXGENE tool for learning from heterogeneous gene expression data using prior knowledge
    Holec, Matej
    Gologuzov, Valentin
    Klema, Jiri
    2014 IEEE 27TH INTERNATIONAL SYMPOSIUM ON COMPUTER-BASED MEDICAL SYSTEMS (CBMS), 2014, : 247 - 250
  • [33] Incorporating prior knowledge from the new person into recognition of facial expression
    Mohammadian, Amin
    Aghaeinia, Hassan
    Towhidkhah, Farzad
    SIGNAL IMAGE AND VIDEO PROCESSING, 2016, 10 (02) : 235 - 242
  • [34] Incorporating prior knowledge into self-supervised representation learning for long PHM signal
    Wang, Yilin
    Li, Yuanxiang
    Zhang, Yuxuan
    Lei, Jia
    Yu, Yifei
    Zhang, Tongtong
    Yang, Yongshen
    Zhao, Honghua
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2024, 241
  • [35] Incorporating prior knowledge of gene functional groups into regularized discriminant analysis of microarray data
    Tai, Feng
    Pan, Wei
    BIOINFORMATICS, 2007, 23 (23) : 3170 - 3177
  • [36] Incorporating Causal Graphical Prior Knowledge into Predictive Modeling via Simple Data Augmentation
    Teshima, Takeshi
    Sugiyama, Masashi
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, VOL 161, 2021, 161 : 86 - 96
  • [37] Using prior knowledge to minimize interference when learning large amounts of information
Kole, James A.
    Healy, Alice F.
    MEMORY & COGNITION, 2007, 35 (01) : 124 - 137
  • [39] Do Gaming Experience and Prior Knowledge Matter When Learning with a Gamified ITS?
    Tahir, Faiza
    Mitrovic, Antonija
    Sotardi, Valerie
    IEEE 21ST INTERNATIONAL CONFERENCE ON ADVANCED LEARNING TECHNOLOGIES (ICALT 2021), 2021, : 75 - 77
  • [40] Incorporating prior knowledge with physics-informed neural networks to predict arterial input functions from dynamic PET images
    Ferrante, M.
    Inglese, M.
    Brusaferri, L.
    Whitehead, A. C.
    Loggia, M. L.
    Toschi, N.
    EUROPEAN JOURNAL OF NUCLEAR MEDICINE AND MOLECULAR IMAGING, 2023, 50 (SUPPL 1) : S376 - S377