Class relationship-based knowledge distillation for efficient human parsing

Cited by: 1
Authors
Lang, Yuqi [1 ]
Liu, Kunliang [1 ,2 ]
Wang, Jianming [2 ]
Hwang, Wonjun [1 ]
Affiliations
[1] Ajou Univ, Dept AI, Suwon, Gyeonggi Do, South Korea
[2] Tiangong Univ, Tianjin, Peoples R China
Keywords
artificial intelligence; computer vision; image recognition; neural nets
DOI
10.1049/ell2.12900
CLC classification
TM [Electrical engineering]; TN [Electronics and communication technology]
Discipline codes
0808; 0809
Abstract
In computer vision, human parsing is challenging because it demands accurate localization of human regions and fine-grained semantic partitioning. This dense prediction task requires substantial computation and high-precision models. To enable real-time parsing on resource-limited devices, the authors introduce a lightweight model with ResNet18 as its backbone. They simplify the pyramid module, improving contextual clarity while reducing complexity, and integrate a spatial attention fusion strategy to counter the precision loss incurred by light-weighting. Traditional models, despite their segmentation precision, are limited by their computational complexity and large parameter counts. To raise the lightweight network's accuracy, the authors apply knowledge distillation (KD). Because conventional distillation can fail to transfer useful knowledge when teacher and student networks differ significantly, they adopt a novel distillation approach based on inter-class and intra-class relations in the prediction outcomes, which noticeably improves parsing accuracy. Experiments on the Look into Person (LIP) dataset show that the lightweight model substantially reduces parameters while maintaining parsing precision and improving inference speed.
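The class relation-based distillation idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: it assumes the inter-class relation for a prediction map is captured by a pairwise cosine-similarity matrix between class score channels, and that the student is penalized for deviating from the teacher's relation matrix. The function names and the mean-squared penalty are illustrative choices.

```python
import numpy as np

def class_relation_matrix(logits):
    """Pairwise inter-class relations from per-pixel class scores.

    logits: (C, N) array -- C class channels flattened over N pixels.
    Returns a (C, C) cosine-similarity matrix between class score maps.
    """
    norms = np.linalg.norm(logits, axis=1, keepdims=True) + 1e-8
    normalized = logits / norms
    return normalized @ normalized.T

def relation_distillation_loss(teacher_logits, student_logits):
    """Mean squared difference between teacher and student relation matrices."""
    rt = class_relation_matrix(teacher_logits)
    rs = class_relation_matrix(student_logits)
    return float(np.mean((rt - rs) ** 2))

# Toy example: 3 classes over 4 pixels.
teacher = np.array([[2.0, 0.1, 0.1, 1.5],
                    [0.1, 2.0, 0.2, 0.1],
                    [0.1, 0.1, 2.0, 0.3]])
student = teacher + 0.05  # a student whose predictions track the teacher
print(relation_distillation_loss(teacher, teacher))  # identical maps -> 0.0
print(relation_distillation_loss(teacher, student))  # small but nonzero
```

Because the loss compares relations between classes rather than raw logits, it can tolerate large architectural gaps between teacher and student, which is the failure mode of conventional distillation that the abstract highlights.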
Pages: 3
Related papers
50 records
  • [41] OPERATIONALIZING RELATIONSHIP-BASED CARE: HANDOFF AT THE BEDSIDE
    Brosnan, Patricia
    Bracken, Thomas
    George, Korkoh-jah
    Lynn, Joan
    Quashie, Wayne
    ONCOLOGY NURSING FORUM, 2012, 39 (03) : E209 - E209
  • [42] Identifying 'the critical' in a relationship-based model of reflection
    Ruch, Gillian
    EUROPEAN JOURNAL OF SOCIAL WORK, 2009, 12 (03) : 349 - 362
  • [43] Part-aware distillation and aggregation network for human parsing
    Lai, Yuntian
    Feng, Yuxin
    Zhou, Fan
    Su, Zhuo
    IMAGE AND VISION COMPUTING, 2025, 158
  • [44] An Administrative Model for Relationship-Based Access Control
    Stoller, Scott D.
    Data and Applications Security and Privacy XXIX, 2015, 9149 : 53 - 68
  • [45] Constructing a relationship-based brand equity model
    Chao-Hung Wang
    Li-Chang Hsu
    Shyh-Rong Fang
    Service Business, 2009, 3
  • [46] VISUAL RELATIONSHIP DETECTION BASED ON GUIDED PROPOSALS AND SEMANTIC KNOWLEDGE DISTILLATION
    Plesse, Francois
    Ginsca, Alexandru
    Delezoide, Bertrand
    Preteux, Francoise
    2018 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2018,
  • [47] Relationship-Based Change Propagation: A Case Study
    Chechik, Marsha
    Lai, Winnie
    Nejati, Shiva
    Cabot, Jordi
    Diskin, Zinovy
    Easterbrook, Steve
    Sabetzadeh, Mehrdad
    Salay, Rick
    2009 ICSE WORKSHOP ON MODELING IN SOFTWARE ENGINEERING (MISE), 2009, : 7 - 12
  • [48] Relationship-Based Care: Customized Primary Nursing
    Manthey, Marie
    Lewis-Hunstiger, Marty
    CREATIVE NURSING, 2006, 12 (01) : 4 - 9
  • [49] Attention-based Feature Interaction for Efficient Online Knowledge Distillation
    Su, Tongtong
    Liang, Qiyu
    Zhang, Jinsong
    Yu, Zhaoyang
    Wang, Gang
    Liu, Xiaoguang
    2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021), 2021, : 579 - 588
  • [50] Structural Knowledge Distillation for Efficient Skeleton-Based Action Recognition
    Bian, Cunling
    Feng, Wei
    Wan, Liang
    Wang, Song
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 2963 - 2976