Class relationship-based knowledge distillation for efficient human parsing

Cited by: 1
Authors
Lang, Yuqi [1]
Liu, Kunliang [1,2]
Wang, Jianming [2]
Hwang, Wonjun [1]
Affiliations
[1] Ajou Univ, Dept AI, Suwon, Gyeonggi Do, South Korea
[2] Tiangong Univ, Tianjin, Peoples R China
Keywords
artificial intelligence; computer vision; image recognition; neural nets
DOI
10.1049/ell2.12900
Chinese Library Classification
TM (Electrical Engineering); TN (Electronic Technology, Communication Technology)
Discipline codes
0808; 0809
Abstract
In computer vision, human parsing is challenging because it requires both accurate localization of human regions and fine-grained semantic partitioning. As a dense prediction task, it typically demands high-precision models and substantial computation. To enable real-time parsing on resource-limited devices, the authors introduce a lightweight model built on a ResNet18 backbone. They simplify the pyramid module to sharpen contextual representation while reducing complexity, and integrate a spatial attention fusion strategy to counteract the precision loss caused by the lightweight design. Because traditional high-accuracy parsing models are limited by their computational cost and large parameter counts, the authors apply knowledge distillation (KD) to raise the lightweight network's accuracy. Conventional distillation methods can fail to transfer useful knowledge when the teacher and student networks differ significantly, so the authors propose a novel distillation approach based on inter-class and intra-class relations in the prediction outputs, which noticeably improves parsing accuracy. Experiments on the Look into Person (LIP) dataset show that the lightweight model substantially reduces parameters while maintaining parsing precision and increasing inference speed.
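
The abstract describes the distillation term only at a high level, so the following is a minimal PyTorch sketch of how a relation-based loss over teacher and student prediction maps could look. It is an assumption for illustration, not the authors' implementation: the function names (inter_class_relation, class_relation_kd_loss), the cosine-similarity definition of inter-class relations, the KL-based intra-class term, and the temperature parameter are all hypothetical.

    # Illustrative sketch of a class-relationship distillation loss (not the authors' code).
    import torch
    import torch.nn.functional as F

    def inter_class_relation(logits: torch.Tensor) -> torch.Tensor:
        """Cosine-similarity matrix between per-class score maps.

        logits: (B, C, H, W) raw prediction scores.
        returns: (B, C, C) inter-class relation matrices.
        """
        b, c, h, w = logits.shape
        feat = logits.reshape(b, c, h * w)      # one vector per class
        feat = F.normalize(feat, dim=-1)        # unit-normalize each class map
        return feat @ feat.transpose(1, 2)      # (B, C, C) pairwise similarities

    def class_relation_kd_loss(student_logits: torch.Tensor,
                               teacher_logits: torch.Tensor,
                               temperature: float = 1.0) -> torch.Tensor:
        """Match student and teacher class relations (hypothetical formulation)."""
        # Inter-class term: align the C x C relation matrices.
        inter = F.mse_loss(inter_class_relation(student_logits),
                           inter_class_relation(teacher_logits))

        # Intra-class term: align each class's spatial score distribution via KL.
        b, c, h, w = student_logits.shape
        s = F.log_softmax(student_logits.reshape(b, c, h * w) / temperature, dim=-1)
        t = F.softmax(teacher_logits.reshape(b, c, h * w) / temperature, dim=-1)
        intra = F.kl_div(s, t, reduction="batchmean") * temperature ** 2

        return inter + intra

In training, such a term would typically be added to the standard cross-entropy segmentation loss with a weighting factor, so the student is supervised by both the ground-truth labels and the teacher's class relationships.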
Pages: 3