Segmentation Guided Attention Networks for Human Pose Estimation

Times Cited: 0
|
Authors
Tang, Jingfan [1 ]
Lu, Jipeng [1 ]
Zhang, Xuefeng [2 ,3 ]
Zhao, Fang [4 ]
Affiliations
[1] Hangzhou Dianzi Univ, Coll Comp, Hangzhou 310018, Peoples R China
[2] Ningbo Univ, Coll Sci & Technol, Lab Intelligent Home Appliances, Ningbo 315300, Peoples R China
[3] Ningbo Univ, Coll Sci & Technol, Sch Informat Engn, Ningbo 315300, Peoples R China
[4] Zhejiang Shuren Univ, Coll Informat Sci & Technol, Hangzhou 310015, Peoples R China
Keywords
human pose estimation; segmentation guided attention; spatial attention maps; deep learning; accuracy improvement;
DOI
10.18280/ts.410522
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Human pose estimation is an important and widely studied task in computer vision. One of its difficulties is that models are vulnerable to complex backgrounds when making predictions. In this paper, we propose a segmentation-guided deep high-resolution network. A conceptually simple yet computationally efficient segmentation-guided module generates segmentation maps, and the resulting segmentation map is used as a spatial attention map in the feature extraction stage. Because the keypoint regions form the foreground of the segmentation map, the model attends more strongly to those regions, effectively reducing the influence of complex backgrounds on the prediction results. Unlike traditional spatial attention mechanisms, the segmentation-guided module provides a spatial attention map informed by prior knowledge. To verify the effectiveness of our method, we conducted a series of comparison experiments on the MPII human pose dataset and the COCO2017 keypoint detection dataset. Compared with HRNet, our model improves accuracy by up to 3% on the COCO2017 dataset. The experimental results show that this segmentation guidance mechanism effectively improves accuracy.
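The mechanism described in the abstract can be summarized as predicting a soft foreground mask over the keypoint regions and multiplying it element-wise with the backbone features. Below is a minimal PyTorch sketch of this segmentation-guided spatial attention idea; the module name, the head design (a 1x1 convolution followed by a sigmoid), the channel sizes, and the point at which the mask is applied are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of segmentation-guided spatial attention (assumptions noted above).
    import torch
    import torch.nn as nn

    class SegmentationGuidedAttention(nn.Module):
        """Predicts a single-channel foreground (keypoint-region) map from
        backbone features and reuses it as a spatial attention map."""

        def __init__(self, in_channels: int):
            super().__init__()
            # Lightweight head: 1x1 conv + sigmoid -> soft foreground mask in [0, 1].
            self.seg_head = nn.Sequential(
                nn.Conv2d(in_channels, 1, kernel_size=1),
                nn.Sigmoid(),
            )

        def forward(self, features: torch.Tensor):
            seg_map = self.seg_head(features)   # (N, 1, H, W) soft segmentation map
            attended = features * seg_map       # broadcast over channels as spatial attention
            return attended, seg_map            # seg_map can also receive a segmentation loss

    if __name__ == "__main__":
        x = torch.randn(2, 32, 64, 48)          # dummy backbone feature map
        block = SegmentationGuidedAttention(in_channels=32)
        out, mask = block(x)
        print(out.shape, mask.shape)            # (2, 32, 64, 48) and (2, 1, 64, 48)

In this sketch the mask both suppresses background activations and can be supervised with person or keypoint-region labels, which is one plausible way to inject the prior knowledge the abstract refers to.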
Pages: 2485 - 2493
Number of pages: 9
Related Papers
50 records in total
  • [1] Improving Human Pose Estimation With Self-Attention Generative Adversarial Networks
    Wang, Xiangyang
    Cao, Zhongzheng
    Wang, Rui
    Liu, Zhi
    Zhu, Xiaoqiang
    IEEE ACCESS, 2019, 7 : 119668 - 119680
  • [2] IMPROVING HUMAN POSE ESTIMATION WITH SELF-ATTENTION GENERATIVE ADVERSARIAL NETWORKS
    Cao, Zhongzheng
    Wang, Rui
    Wang, Xiangyang
    Liu, Zhi
    Zhu, Xiaoqiang
    2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA & EXPO WORKSHOPS (ICMEW), 2019, : 567 - 572
  • [3] Knowledge-Guided Deep Fractal Neural Networks for Human Pose Estimation
    Ning, Guanghan
    Zhang, Zhi
    He, Zhiquan
    IEEE TRANSACTIONS ON MULTIMEDIA, 2018, 20 (05) : 1246 - 1259
  • [4] Pose estimation based on human detection and segmentation
    Chen, Qiang
    Zheng, EnLiang
    Liu, YunCai
    SCIENCE IN CHINA SERIES F-INFORMATION SCIENCES, 2009, 52 (02) : 244 - 251
  • [5] Integrating Grammar and Segmentation for Human Pose Estimation
    Rothrock, Brandon
    Park, Seyoung
    Zhu, Song-Chun
    2013 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2013, : 3214 - 3221
  • [6] Lightweight Human Pose Estimation with Attention Mechanism
    Chu Xiaoshuai
    Ji Ruirui
    Dong Danyang
    Xi Yuzhuo
    FOURTEENTH INTERNATIONAL CONFERENCE ON GRAPHICS AND IMAGE PROCESSING, ICGIP 2022, 2022, 12705
  • [7] Attention Refined Network for Human Pose Estimation
    Wang, Xiangyang
    Tong, Jiangwei
    Wang, Rui
    NEURAL PROCESSING LETTERS, 2021, 53 (04) : 2853 - 2872