Co-training Based Attribute Reduction for Partially Labeled Data

Cited by: 1
|
Authors
Zhang, Wei [1 ,2 ]
Miao, Duoqian [1 ,3 ]
Gao, Can [4 ]
Yue, Xiaodong [5 ]
Affiliations
[1] Tongji Univ, Sch Elect & Informat Engn, Shanghai 201804, Peoples R China
[2] Shanghai Univ Elect Power, Sch Comp Sci & Technol, Shanghai 200090, Peoples R China
[3] Tongji Univ, Minist Educ, Key Lab Embedded Syst & Serv Comp, Shanghai 201804, Peoples R China
[4] Zoomlion Heavy Ind Sci & Technol Dev Co Ltd, Changsha 410013, Peoples R China
[5] Shanghai Univ, Sch Engn & Comp Sci, Shanghai 200444, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Rough Sets; Co-training; Incremental Attribute Reduction; Partially Labeled Data; Semi-supervised Learning;
DOI
10.1007/978-3-319-11740-9_8
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Rough set theory is an effective supervised learning model for labeled data. In practice, however, problems often involve both labeled and unlabeled data. This paper studies the problem of attribute reduction for partially labeled data. A novel semi-supervised attribute reduction algorithm based on co-training is proposed, which exploits the unlabeled data to improve the quality of the attribute reducts computed from the few labeled data. The algorithm first computes two diverse reducts of the labeled data and uses them to train two base classifiers, which are then co-trained iteratively. In each round, the base classifiers learn from each other on the unlabeled data and enlarge the labeled data, so that reducts of better quality can be computed from the enlarged labeled data and used to construct base classifiers with higher performance. Experimental results on UCI data sets show that the proposed algorithm can improve the quality of reducts.
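The iterative loop summarized in the abstract (compute two diverse reducts, train two base classifiers, let them pseudo-label unlabeled data for each other, recompute reducts on the enlarged labeled data) can be sketched in a few dozen lines. The Python sketch below is only an illustration under stated assumptions, not the paper's algorithm: a mutual-information top-k feature subset (scikit-learn's mutual_info_classif) stands in for a rough-set attribute reduct, the two "diverse" views are simply disjoint halves of the feature space, decision trees play the role of the base classifiers, and the helper names proxy_reduct and cotrain are hypothetical; the breast-cancer data set is an arbitrary stand-in for the UCI data used in the paper.

```python
# Minimal sketch of the co-training scheme summarized in the abstract.
# Illustration under assumptions, NOT the paper's algorithm:
#  - proxy_reduct() uses a mutual-information top-k subset as a stand-in
#    for a rough-set attribute reduct;
#  - the two "diverse" views are disjoint halves of the feature space;
#  - decision trees serve as the base classifiers.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif
from sklearn.tree import DecisionTreeClassifier


def proxy_reduct(X, y, pool, k=5):
    """Pick the k most informative features (by mutual information)
    from the given feature pool -- a crude proxy for a reduct."""
    scores = mutual_info_classif(X[:, pool], y, random_state=0)
    return [pool[i] for i in np.argsort(scores)[::-1][:k]]


def cotrain(X, y, labeled, unlabeled, rounds=5, per_round=10):
    """y must hold valid labels at `labeled` positions; entries at
    `unlabeled` positions are ignored until they are pseudo-labeled."""
    n = X.shape[1]
    views = (list(range(n // 2)), list(range(n // 2, n)))  # two feature pools
    labeled, unlabeled = list(labeled), list(unlabeled)
    y = y.copy()

    for _ in range(rounds):
        if not unlabeled:
            break
        clfs, reducts = [], []
        for pool in views:
            # Recompute the (proxy) reduct on the current labeled data,
            # then train a base classifier restricted to that reduct.
            red = proxy_reduct(X[labeled], y[labeled], pool)
            clf = DecisionTreeClassifier(random_state=0)
            clf.fit(X[np.ix_(labeled, red)], y[labeled])
            clfs.append(clf)
            reducts.append(red)

        # Each classifier pseudo-labels its most confident unlabeled
        # examples, enlarging the labeled data for its peer.
        newly = set()
        for clf, red in zip(clfs, reducts):
            proba = clf.predict_proba(X[np.ix_(unlabeled, red)])
            for p in np.argsort(proba.max(axis=1))[::-1][:per_round]:
                idx = unlabeled[p]
                y[idx] = clf.classes_[np.argmax(proba[p])]
                newly.add(idx)
        labeled.extend(newly)
        unlabeled = [i for i in unlabeled if i not in newly]

    return clfs, reducts, labeled


if __name__ == "__main__":
    data = load_breast_cancer()           # any UCI-style tabular data set
    X, y_true = data.data, data.target
    order = np.random.default_rng(42).permutation(len(y_true))
    labeled, unlabeled = order[:30], order[30:]
    y = y_true.copy()
    y[unlabeled] = -1                      # hide the "unlabeled" labels
    _, reducts, enlarged = cotrain(X, y, labeled, unlabeled)
    print("labeled examples after co-training:", len(enlarged))
    print("final (proxy) reducts, as feature indices:", reducts)
```

The key design point mirrored from the abstract is that the reducts are recomputed from the enlarged labeled data in every round, so feature selection and classifier training improve together rather than being fixed up front.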
Pages: 77 - 88
Number of pages: 12
Related Papers
50 records in total
  • [1] Diverse reduct subspaces based co-training for partially labeled data
    Miao, Duoqian
    Gao, Can
    Zhang, Nan
    Zhang, Zhifei
    INTERNATIONAL JOURNAL OF APPROXIMATE REASONING, 2011, 52 (08) : 1103 - 1117
  • [2] Three-way decision with co-training for partially labeled data
    Gao, Can
    Zhou, Jie
    Miao, Duoqian
    Wen, Jiajun
    Yue, Xiaodong
    INFORMATION SCIENCES, 2021, 544 : 500 - 518
  • [3] Attribute reduction for partially labeled data based on hypergraph models
    Xie, Xiaojun
    Qin, Xiaolin
    Huang, Guangmei
    Zhao, Wei
    2019 IEEE 31ST INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2019), 2019, : 1434 - 1439
  • [4] Neighborhood attribute reduction approach to partially labeled data
    Liu, Keyu
    Tsang, Eric C. C.
    Song, Jingjing
    Yu, Hualong
    Chen, Xiangjian
    Yang, Xibei
    GRANULAR COMPUTING, 2020, 5 (02) : 239 - 250
  • [5] Authorship Attribution with Very Few Labeled Data: A Co-training Approach
    Fan, Mengdi
    Qian, Tieyun
    Chen, Li
    Liu, Bin
    Zhong, Ming
    He, Guoliang
    WEB-AGE INFORMATION MANAGEMENT, WAIM 2014, 2014, 8485 : 657 - 668
  • [6] Granular-conditional-entropy-based attribute reduction for partially labeled data with proxy labels
    Gao, Can
    Zhou, Jie
    Miao, Duoqian
    Yue, Xiaodong
    Wan, Jun
    INFORMATION SCIENCES, 2021, 580 : 111 - 128
  • [7] Semi-supervised attribute reduction for partially labeled categorical data based on predicted label
    Huang, Dan
    Zhang, Qinli
    Li, Zhaowen
    INTERNATIONAL JOURNAL OF APPROXIMATE REASONING, 2023, 154 : 242 - 261
  • [8] A Test Cost Sensitive Heuristic Attribute Reduction Algorithm for Partially Labeled Data
    Hu, Shengdan
    Miao, Duoqian
    Zhang, Zhifei
    Luo, Sheng
    Zhang, Yuanjian
    Hu, Guirong
    ROUGH SETS, IJCRS 2018, 2018, 11103 : 257 - 269
  • [9] DCPE Co-Training: Co-Training Based on Diversity of Class Probability Estimation
    Xu, Jin
    He, Haibo
    Man, Hong
    2010 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS IJCNN 2010, 2010,