Multi-Scale Context Aggregation Network with Attention-Guided for Crowd Counting

Cited by: 17
Authors
Wang, Xin [1 ,2 ]
Lv, Rongrong [1 ]
Zhao, Yang [2 ]
Yang, Tangwen [1 ]
Ruan, Qiuqi [1 ,2 ]
Affiliations
[1] Beijing Jiaotong Univ, Inst Informat Sci, Sch Comp & Informat Technol, Beijing 100044, Peoples R China
[2] Shenzhen Univ, Guangdong Key Lab Intelligent Informat Proc, Shenzhen 518060, Guangdong, Peoples R China
Keywords
dense context-aware module; hierarchical attention guided; multi-scale extraction; crowd counting;
DOI
10.1109/ICSP48669.2020.9321067
CLC classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Subject classification codes
0808 ; 0809 ;
Abstract
Crowd counting aims to estimate the number of people in an image and to generate the corresponding density map. The task poses many challenges, including varying head scales, the diversity of crowd distributions across images, and cluttered backgrounds. In this paper, we propose a multi-scale context aggregation network (MSCANet) for crowd counting, based on a single-column encoder-decoder architecture, which consists of an encoder built on a dense context-aware module (DCAM) and a hierarchical attention-guided decoder. To handle scale variation, we construct the DCAM to aggregate multi-scale contextual information by densely connecting dilated convolutions with varying receptive fields. The proposed DCAM captures rich contextual information of crowd areas thanks to its long-range receptive fields and dense scale sampling. Moreover, to suppress background noise and generate a high-quality density map, we adopt a hierarchical attention-guided mechanism in the decoder. This helps to integrate more useful spatial information from shallow feature maps of the encoder by introducing multiple supervision based on a semantic attention module (SAM). Extensive experiments demonstrate that the proposed approach achieves better performance than other similar state-of-the-art methods on three challenging benchmark datasets for crowd counting.
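The DCAM described above aggregates multi-scale context by densely connecting dilated convolutions: each branch sees the concatenation of the input and all earlier branch outputs, so receptive fields grow while scales are sampled densely. A minimal PyTorch sketch of this idea; the channel widths, growth rate, and dilation rates (1, 2, 4, 8) are illustrative assumptions, not the paper's exact configuration:

```python
import torch
import torch.nn as nn

class DenseContextAwareModule(nn.Module):
    """Sketch of a dense context-aware module (DCAM): dilated 3x3
    convolutions with growing receptive fields, densely connected so
    each branch consumes the concatenation of all earlier features."""

    def __init__(self, in_channels=64, growth=32, dilations=(1, 2, 4, 8)):
        super().__init__()
        self.branches = nn.ModuleList()
        channels = in_channels
        for d in dilations:
            # padding == dilation keeps the spatial size unchanged
            self.branches.append(nn.Sequential(
                nn.Conv2d(channels, growth, kernel_size=3,
                          padding=d, dilation=d),
                nn.ReLU(inplace=True),
            ))
            channels += growth  # dense connectivity widens the next input
        # 1x1 conv fuses the aggregated multi-scale context
        self.fuse = nn.Conv2d(channels, in_channels, kernel_size=1)

    def forward(self, x):
        feats = [x]
        for branch in self.branches:
            feats.append(branch(torch.cat(feats, dim=1)))
        return self.fuse(torch.cat(feats, dim=1))

x = torch.randn(1, 64, 32, 32)
y = DenseContextAwareModule()(x)  # same shape as the input
```

Because every branch reuses all earlier features, the module samples a dense range of effective receptive fields without deepening the network, which is the property the abstract attributes to the DCAM.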
Pages: 240 - 245
Number of pages: 6
Related papers
50 records
  • [41] Attention Guided Encoder-Decoder Network With Multi-Scale Context Aggregation for Land Cover Segmentation
    Wang, Shuyang
    Mu, Xiaodong
    Yang, Dongfang
    He, Hao
    Zhao, Peng
    IEEE ACCESS, 2020, 8 : 215299 - 215309
  • [42] Lightweight multi-scale attention-guided network for real-time semantic segmentation
    Hu, Xuegang
    Liu, Yuanjing
    IMAGE AND VISION COMPUTING, 2023, 139
  • [43] Coarse-to-fine multi-scale attention-guided network for multi-exposure image fusion
    Hao Zhao
    Jingrun Zheng
    Xiaoke Shang
    Wei Zhong
    Jinyuan Liu
    The Visual Computer, 2024, 40 : 1697 - 1710
  • [45] MULTI-STEP QUANTIZATION OF A MULTI-SCALE NETWORK FOR CROWD COUNTING
    Shim, Kyujin
    Byun, Junyoung
    Kim, Changick
    2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020, : 683 - 687
  • [46] Attention-Guided Multi-Scale Feature Fusion Network for Low-Light Image Enhancement
    Cui, HengShuai
    Li, Jinjiang
    Hua, Zhen
    Fan, Linwei
    FRONTIERS IN NEUROROBOTICS, 2022, 16
  • [47] Noise Suppression of DAS Seismic Data by Attention-guided Multi-scale Generative Adversarial Network
    Wu N.
    Wang Y.
    Li Y.
GEOPHYSICS, 2023, 88 (03)
  • [48] Crowd Counting Method Based on Multi-Scale Enhanced Network
    Xu Tao
    Duan Yinong
    Du Jiahao
    Liu Caihua
    JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY, 2021, 43 (06) : 1764 - 1771
  • [49] A Residual UNet Denoising Network Based on Multi-Scale Feature Extraction and Attention-Guided Filter
    Liu, Hualin
    Li, Zhe
    Lin, Shijie
    Cheng, Libo
    SENSORS, 2023, 23 (16)
  • [50] A Novel Multi-Scale Channel Attention-Guided Neural Network for Brain Stroke Lesion Segmentation
    Li, Zhihua
    Xing, Qiwei
    Li, Yanfang
    He, Wei
    Miao, Yu
    Ji, Bai
    Shi, Weili
    Jiang, Zhengang
IEEE ACCESS, 2023, 11 : 66050 - 66062