CapsGaNet: Deep Neural Network Based on Capsule and GRU for Human Activity Recognition

Cited by: 19
Authors
Sun, Xiaojie [1 ]
Xu, Hongji [1 ]
Dong, Zheng [1 ]
Shi, Leixin [1 ]
Liu, Qiang [1 ]
Li, Juan [1 ]
Li, Tiankuo [1 ]
Fan, Shidi [1 ]
Wang, Yuhao [1 ]
Affiliations
[1] Shandong Univ, Sch Informat Sci & Engn, Qingdao 266237, Peoples R China
Source
IEEE SYSTEMS JOURNAL | 2022, Vol. 16, Issue 4
Keywords
Feature extraction; Deep learning; Convolutional neural networks; Activity recognition; Convolution; Sensors; Kernel; Aggressive activity; deep learning; human activity recognition (HAR); spatiotemporal feature; WEARABLE SENSOR;
DOI
10.1109/JSYST.2022.3153503
Chinese Library Classification (CLC): TP [Automation technology; computer technology]
Discipline code: 0812
Abstract
Advances in deep learning, with its ability to automatically extract high-level features, have opened promising prospects for human activity recognition (HAR). However, traditional HAR methods still suffer from incomplete feature extraction, which may lead to incorrect recognition results. To address this problem, a novel framework for spatiotemporal multi-feature extraction in HAR, called CapsGaNet, is proposed; it is based on capsules and gated recurrent units (GRU) with attention mechanisms. The proposed framework comprises a spatial feature extraction layer consisting of capsule blocks, a temporal feature extraction layer consisting of GRUs with attention mechanisms, and an output layer. In addition, considering the practical demand for recognizing aggressive activities in specific scenarios such as smart prisons, we constructed a daily and aggressive activity dataset (DAAD). Moreover, based on the acceleration characteristics of aggressive activity, a threshold-based approach for aggressive activity detection is proposed to meet the requirements of high real-time performance and low computational complexity in prison scenarios. Experiments on the wireless sensor data mining (WISDM) dataset and the DAAD dataset verify that the proposed CapsGaNet effectively improves recognition accuracy. The proposed threshold-based approach for aggressive activity detection enables more effective HAR with smart sensor devices in smart prison scenarios.
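The threshold-based detection described in the abstract can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's actual method: the function name, window shape, and the 2.5 g threshold are illustrative assumptions, based only on the abstract's statement that detection relies on the acceleration characteristics of aggressive activity.

```python
import numpy as np

def detect_aggressive(window, threshold=2.5):
    """Flag a window of tri-axial accelerometer samples as aggressive
    if the peak acceleration magnitude (in g) exceeds the threshold.
    The threshold value here is illustrative, not from the paper."""
    window = np.asarray(window, dtype=float)    # shape: (n_samples, 3)
    magnitude = np.linalg.norm(window, axis=1)  # per-sample magnitude
    return bool(magnitude.max() > threshold)

# Synthetic example windows (hypothetical data):
calm = [[0.0, 0.0, 1.0]] * 50        # near-rest: magnitude ~1 g
punch = calm + [[2.0, 2.0, 1.0]]     # one high-acceleration spike (~3 g)
```

A simple peak-magnitude test like this runs in O(n) per window with no model inference, which matches the abstract's stated goals of high real-time performance and low computational complexity.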
Pages: 5845-5855 (11 pages)
Related Papers
50 records in total
  • [11] Human Activity Recognition with a Time Distributed Deep Neural Network
    Pareek, Gunjan
    Nigam, Swati
    Shastri, Anshuman
    Singh, Rajiv
    INTELLIGENT HUMAN COMPUTER INTERACTION, IHCI 2023, PT II, 2024, 14532 : 127 - 136
  • [12] DCapsNet: Deep capsule network for human activity and gait recognition with smartphone sensors
    Sezavar, Ahmadreza
    Atta, Randa
    Ghanbari, Mohammed
    PATTERN RECOGNITION, 2024, 147
  • [13] A fuzzy convolutional attention-based GRU network for human activity recognition
    Khodabandelou, Ghazaleh
    Moon, Huiseok
    Amirat, Yacine
    Mohammed, Samer
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 118
  • [14] Human Activity Recognition Based on Gramian Angular Field and Deep Convolutional Neural Network
    Xu, Hongji
    Li, Juan
    Yuan, Hui
    Liu, Qiang
    Fan, Shidi
    Li, Tiankuo
    Sun, Xiaojie
    IEEE ACCESS, 2020, 8 (08): 199393 - 199405
  • [15] Optimization of deep neural network-based human activity recognition for a wearable device
    Suwannarat, K.
    Kurdthongmee, W.
    HELIYON, 2021, 7 (08)
  • [16] Recognition of human activity using GRU deep learning algorithm
    Saeed Mohsen
    Multimedia Tools and Applications, 2023, 82 : 47733 - 47749
  • [18] Human Activity Recognition Based On Convolutional Neural Network
    Xu, Wenchao
    Pang, Yuxin
    Yang, Yanqin
    Liu, Yanbo
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 165 - 170
  • [19] Human Activity Recognition Based on Convolutional Neural Network
    Coelho, Yves
    Rangel, Luara
    dos Santos, Francisco
    Frizera-Neto, Anselmo
    Bastos-Filho, Teodiano
    XXVI BRAZILIAN CONGRESS ON BIOMEDICAL ENGINEERING, CBEB 2018, VOL. 2, 2019, 70 (02): 247 - 252
  • [20] Human ear recognition based on deep convolutional neural network
    Tian Ying
    Wang Shining
    Li Wanxiang
    PROCEEDINGS OF THE 30TH CHINESE CONTROL AND DECISION CONFERENCE (2018 CCDC), 2018, : 1830 - 1835