An Algebraic Approach to Combining Classifiers

Cited by: 4
Authors
Giabbanelli, Philippe J. [1 ]
Peters, Joseph G. [2 ]
Affiliations
[1] Univ Cambridge, Cambridge, England
[2] Simon Fraser Univ, Burnaby, BC V5A 1S6, Canada
Keywords
Model combination; Non-stationary distributions; Unsupervised meta-learning
DOI
10.1016/j.procs.2015.05.346
Chinese Library Classification (CLC)
TP39 [Applications of Computers]
Discipline Codes
081203; 0835
Abstract
In distributed classification, each learner observes its environment and deduces a classifier. As a learner has only a local view of its environment, classifiers can be exchanged among the learners and integrated, or merged, to improve accuracy. However, the operation of merging is not defined for most classifiers. Furthermore, the classifiers that have to be merged may be of different types in settings such as ad-hoc networks in which several generations of sensors may be creating classifiers. We introduce decision spaces as a framework for merging possibly different classifiers. We formally study the merging operation as an algebra, and prove that it satisfies a desirable set of properties. The impact of time is discussed for the two main data mining settings. Firstly, decision spaces can naturally be used with non-stationary distributions, such as the data collected by sensor networks, as the impact of a model decays over time. Secondly, we introduce an approach for stationary distributions, such as homogeneous databases partitioned over different learners, which ensures that all models have the same impact. We also present a method using storage flexibly to achieve different types of decay for non-stationary distributions.
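As an illustration only, the sketch below shows one plausible way to combine classifier predictions while letting each model's influence decay over time, in the spirit of the non-stationary setting the abstract describes. The class name, the decay parameter half_life, and the weighted-voting scheme are assumptions introduced here for illustration; they are not the decision-space algebra defined in the paper.

```python
# Illustrative sketch only: weighted majority voting over classifiers whose
# influence decays exponentially with age, loosely mirroring the idea that
# "the impact of a model decays over time" for non-stationary distributions.
# The decision-space algebra of the paper is NOT reproduced here.
from dataclasses import dataclass
from collections import defaultdict
from typing import Any, Callable


@dataclass
class TimedClassifier:
    predict: Callable[[Any], Any]   # maps a sample to a class label
    created_at: float               # time at which the classifier was learned


def merge_predict(classifiers, sample, now, half_life=10.0):
    """Combine predictions, weighting each classifier by 0.5 ** (age / half_life)."""
    votes = defaultdict(float)
    for clf in classifiers:
        age = max(0.0, now - clf.created_at)
        weight = 0.5 ** (age / half_life)     # older models count for less
        votes[clf.predict(sample)] += weight
    return max(votes, key=votes.get)          # label with the largest total weight


# Example: two trivial threshold classifiers learned at different times.
old = TimedClassifier(predict=lambda x: int(x > 0.3), created_at=0.0)
new = TimedClassifier(predict=lambda x: int(x > 0.7), created_at=20.0)
print(merge_predict([old, new], sample=0.5, now=25.0))  # newer model dominates -> 0
```

For the stationary setting the abstract mentions, the same sketch with a constant weight of 1 per classifier would give every model equal impact.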
Pages: 1545-1554
Page count: 10