Progressive Training Technique with Weak-Label Boosting for Fine-Grained Classification on Unbalanced Training Data

Cited by: 0
Authors
Jin, Yuhui [1 ]
Wang, Zuyun [2 ]
Liao, Huimin [2 ]
Zhu, Sainan [3 ]
Tong, Bin [3 ]
Yin, Yu [4 ]
Huang, Jian [1 ]
Affiliations
[1] Beihang Univ, State Key Lab Software Dev Environm, Beijing 100191, Peoples R China
[2] Beijing Transportat Comprehens Law Enforcement Co, Beijing 100044, Peoples R China
[3] China Inst Geoenvironm Monitoring, Beijing 100081, Peoples R China
[4] Peking Univ, Affiliated High Sch, Beijing 102218, Peoples R China
Keywords
unbalanced training data; progressive training; weak-label boosting; instance-aware hard ID mining strategy; feature-mapping loss;
DOI
10.3390/electronics11111684
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Discipline Code
0812
Abstract
In practical classification tasks, the sample distribution of a dataset is often unbalanced; for example, a dataset may contain a massive quantity of weakly labeled samples for which a concrete identity is unavailable. Even among the exactly labeled samples, many labels have only a few examples, which makes it difficult to learn those concepts from so little labeled data. In addition, the categories typically exhibit small inter-class variance and large intra-class variance. Weak labels, few-shot learning, and fine-grained analysis are therefore the key challenges affecting the performance of the classification model. In this paper, we develop a progressive training technique to address the few-shot challenge, together with a weak-label boosting method that treats every weak ID as a negative sample of each predefined ID, thereby taking full advantage of the more numerous weak-label data. We introduce an instance-aware hard ID mining strategy in the classification loss, and further develop global and local feature-mapping losses to widen the decision margin. We entered the proposed method into a Kaggle competition whose goal was to build an algorithm that identifies individual humpback whales in images; combined with a few other common training tricks, the proposed approach won first place. All three problems (weak labels, few-shot learning, and fine-grained analysis) are present in the competition dataset. We also applied our method to CUB-2011 and Cars-196, the most widely used datasets for fine-grained visual categorization, achieving accuracies of 90.1% and 94.9%, respectively. These experiments show that the proposed method outperforms other common baselines and verify its effectiveness. Our solution has been made available as an open-source project.
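The weak-label boosting idea described in the abstract — treating every weak ID as a negative sample of each predefined ID — can be sketched as a target-encoding scheme for a per-class sigmoid classification loss. The following is a minimal NumPy sketch under that assumption; the function names, the all-zero target row for weak-label samples, and the plain binary cross-entropy are illustrative choices, not the authors' actual implementation.

```python
import numpy as np

def weak_label_boosting_targets(labels, num_ids):
    """Build per-class binary targets for a batch.

    labels: list of int class indices, or None for a weak-label sample
            (an individual that belongs to none of the predefined IDs).
    A weak-label sample is treated as a negative for EVERY predefined
    ID, so its target row is all zeros; exactly labeled samples get a
    one-hot row. (Illustrative encoding, not the paper's exact code.)
    """
    targets = np.zeros((len(labels), num_ids), dtype=np.float32)
    for i, y in enumerate(labels):
        if y is not None:          # exact label -> positive for one ID
            targets[i, y] = 1.0
    return targets

def bce_loss(logits, targets):
    """Per-class sigmoid cross-entropy, averaged over the batch."""
    probs = 1.0 / (1.0 + np.exp(-logits))
    eps = 1e-7                     # numerical guard for log(0)
    losses = -(targets * np.log(probs + eps)
               + (1.0 - targets) * np.log(1.0 - probs + eps))
    return float(losses.mean())
```

Under this encoding, weak-label samples still contribute gradient to every class head (pushing all predefined-ID scores down), which is how the abundant weakly labeled data can be exploited without knowing their identities.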
Pages: 17