Class relationship-based knowledge distillation for efficient human parsing

Cited by: 1
Authors
Lang, Yuqi [1 ]
Liu, Kunliang [1 ,2 ]
Wang, Jianming [2 ]
Hwang, Wonjun [1 ]
Affiliations
[1] Ajou Univ, Dept AI, Suwon, Gyeonggi Do, South Korea
[2] Tiangong Univ, Tianjin, Peoples R China
Keywords
artificial intelligence; computer vision; image recognition; neural nets
DOI
10.1049/ell2.12900
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Codes
0808; 0809
Abstract
In computer vision, human parsing is challenging because it demands both accurate localization of human regions and fine-grained semantic partitioning; as a dense prediction task, it typically requires heavy computation and high-precision models. To enable real-time parsing on resource-limited devices, the authors introduce a lightweight model built on a ResNet18 backbone. They simplify the pyramid module, improving contextual representation while reducing complexity, and integrate a spatial attention fusion strategy to counter the precision loss incurred by light-weighting. Traditional models, despite their segmentation precision, are limited by computational complexity and large parameter counts, so the authors apply knowledge distillation (KD) to raise the lightweight network's accuracy. Because conventional distillation can fail to transfer useful knowledge when teacher and student networks differ substantially, they propose a novel distillation approach based on inter-class and intra-class relations in the prediction outputs, which noticeably improves parsing accuracy. Experiments on the Look into Person (LIP) dataset show that the lightweight model significantly reduces parameters while maintaining parsing precision and improving inference speed.
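The relation-based distillation idea in the abstract, matching inter-class relations between teacher and student predictions, can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's exact formulation: the names `class_relation_matrix` and `relation_kd_loss`, the cosine-affinity construction, and the MSE matching term are all assumptions.

```python
# A minimal sketch (assumed reconstruction, not the authors' exact loss) of
# class relation-based knowledge distillation: from teacher and student
# prediction maps we build a class-affinity matrix capturing inter-class
# relations, then penalize the student for deviating from the teacher's matrix.
import numpy as np

def class_relation_matrix(logits):
    """logits: (C, H, W) per-class response maps -> (C, C) cosine
    similarity between the flattened per-class response vectors."""
    c = logits.shape[0]
    feat = logits.reshape(c, -1)                        # one vector per class
    feat = feat / np.linalg.norm(feat, axis=1, keepdims=True)
    return feat @ feat.T                                # (C, C) affinities

def relation_kd_loss(student_logits, teacher_logits):
    """Mean squared error between student and teacher class-affinity matrices."""
    rs = class_relation_matrix(student_logits)
    rt = class_relation_matrix(teacher_logits)          # teacher is fixed
    return float(np.mean((rs - rt) ** 2))

# Toy check: 20 classes (roughly LIP's label set) on an 8x8 prediction map.
rng = np.random.default_rng(0)
student = rng.standard_normal((20, 8, 8))
teacher = rng.standard_normal((20, 8, 8))
loss = relation_kd_loss(student, teacher)
```

In a real training loop this term would be added, with a weighting factor, to the usual cross-entropy segmentation loss, so the student learns both the ground-truth labels and the teacher's inter-class structure.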
Pages: 3
Related Papers
50 records in total
  • [1] HiveRel: hexagons visualization for relationship-based knowledge acquisition
    Yogev, Sivan
    Shani, Guy
    Tractinsky, Noam
    CCF TRANSACTIONS ON PERVASIVE COMPUTING AND INTERACTION, 2022, 4 (04) : 408 - 436
  • [3] An efficient preprocessing stage for the relationship-based clustering framework
    Bilgin, Turgay Tugay
    Camurcu, Ali Yilmaz
    INTELLIGENT DATA ANALYSIS, 2010, 14 (06) : 731 - 748
  • [4] Multilingual AMR Parsing with Noisy Knowledge Distillation
    Cai, Deng
    Li, Xin
    Ho, Jackie Chun-Sing
    Bing, Lidong
    Lam, Wai
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 2778 - 2789
  • [5] Class Attention Transfer Based Knowledge Distillation
    Guo, Ziyao
    Yan, Haonan
    Li, Hui
    Lin, Xiaodong
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 11868 - 11877
  • [6] Efficient and Extensible Policy Mining for Relationship-Based Access Control
    Bui, Thang
    Stoller, Scott D.
    Le, Hieu
    PROCEEDINGS OF THE 24TH ACM SYMPOSIUM ON ACCESS CONTROL MODELS AND TECHNOLOGIES (SACMAT '19), 2019, : 161 - 172
  • [7] Adaptive class token knowledge distillation for efficient vision transformer
    Kang, Minchan
    Son, Sanghyeok
    Kim, Daeshik
    KNOWLEDGE-BASED SYSTEMS, 2024, 304
  • [8] Psychosocial and Relationship-Based Practice
    Turner, Denise
    BRITISH JOURNAL OF SOCIAL WORK, 2015, 45 (06): : 1935 - 1937
  • [9] The Emergence of Relationship-based Cooperation
    Xu, Bo
    Wang, Jianwei
    SCIENTIFIC REPORTS, 2015, 5
  • [10] Psychosocial and relationship-based practice
    Archard, Philip John
    SOCIAL WORK EDUCATION, 2016, 35 (02) : 238 - +