In-depth analysis of design & development for sensor-based human activity recognition system

Cited by: 2
Authors
Choudhury, Nurul Amin [1 ]
Soni, Badal [1 ]
Affiliation
[1] National Institute of Technology Silchar, Department of Computer Science & Engineering, Cachar 788010, Assam, India
Keywords
Human activity recognition; Shallow learning; Ensemble learning; Deep learning; Activities of daily living; Wearable sensors; Accelerometer; Framework
DOI
10.1007/s11042-023-16423-5
Chinese Library Classification (CLC): TP [Automation technology; Computer technology]
Discipline code: 0812
Abstract
Human Activity Recognition (HAR) has gained much attention as sensor technology has become more advanced and cost-effective. HAR is the process of identifying an individual's activities of daily living with the help of an efficient learning algorithm and prospective user-generated datasets. This paper addresses the technical advancement and classification of HAR systems in detail. Design issues, future opportunities, recent state-of-the-art related works, and a generic framework for activity recognition are discussed comprehensively with analytical discussion. Different publicly available datasets, with their features and incorporated sensors, are also described. Pre-processing techniques, performance metrics such as accuracy, F1-score, precision, recall, and computational time, and evaluation schemes are discussed for a comprehensive understanding of the Activity Recognition Chain (ARC). Different learning algorithms are exploited and compared for learning-based performance comparison. For each module of this paper, a compendious set of references is also cited for easy referencing. The main aim of this study is to give readers an easy, hands-on path to implementation in the field of HAR with verifiable evidence of different design issues.
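For orientation only, the following is a minimal sketch of the Activity Recognition Chain (ARC) the abstract refers to: windowed sensor data, simple statistical feature extraction, a classifier, and the metrics the paper discusses (accuracy, precision, recall, F1-score). The synthetic data, the window size, and the random-forest classifier are illustrative assumptions, not the authors' actual configuration.

# Minimal ARC sketch with synthetic tri-axial accelerometer data.
# All names and parameters below are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

rng = np.random.default_rng(0)

# 600 windows x 128 samples x 3 axes, two pretend activity classes.
windows = rng.normal(size=(600, 128, 3))
labels = rng.integers(0, 2, size=600)
windows[labels == 1] += 0.5  # make the classes separable

# Feature extraction: per-axis mean and standard deviation (6 features per window).
features = np.concatenate([windows.mean(axis=1), windows.std(axis=1)], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0, stratify=labels
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)

precision, recall, f1, _ = precision_recall_fscore_support(
    y_test, y_pred, average="macro"
)
print(f"accuracy={accuracy_score(y_test, y_pred):.3f} "
      f"precision={precision:.3f} recall={recall:.3f} f1={f1:.3f}")

The same metric calls apply unchanged when the synthetic arrays are replaced with windows from a real HAR dataset of the kind surveyed in the paper.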
Pages: 73233-73272 (40 pages)
Related papers (50 in total)
  • [31] Evaluation of machine learning approaches for sensor-based human activity recognition
    Yousif, Hala Muhanad
    Abdulah, Dhahir Abdulhade
    INTERNATIONAL JOURNAL OF NONLINEAR ANALYSIS AND APPLICATIONS, 2022, 13 (02): : 1183 - 1200
  • [32] SenseMLP: a parallel MLP architecture for sensor-based human activity recognition
    Li, Weilin
    Guo, Jiaming
    Wu, Hong
    MULTIMEDIA SYSTEMS, 2024, 30 (04)
  • [33] Comparison of Sensor-Based Datasets for Human Activity Recognition in Wearable IoT
    Khare, Shivanjali
    Sarkar, Sayani
    Totaro, Michael
    2020 IEEE 6TH WORLD FORUM ON INTERNET OF THINGS (WF-IOT), 2020,
  • [34] Sensor-Based Human Activity Recognition in a Multi-user Scenario
    Wang, Liang
    Gu, Tao
    Tao, Xianping
    Lu, Jian
    AMBIENT INTELLIGENCE, PROCEEDINGS, 2009, 5859 : 78 - +
  • [35] Sensor-based and vision-based human activity recognition: A comprehensive survey
    Dang, L. Minh
    Min, Kyungbok
    Wang, Hanxiang
    Piran, Md. Jalil
    Lee, Cheol Hee
    Moon, Hyeonjoon
    PATTERN RECOGNITION, 2020, 108 (108)
  • [36] Deep Triplet Networks with Attention for Sensor-based Human Activity Recognition
    Khaertdinov, Bulat
    Ghaleb, Esam
    Asteriadis, Stylianos
    2021 IEEE INTERNATIONAL CONFERENCE ON PERVASIVE COMPUTING AND COMMUNICATIONS (PERCOM), 2021,
  • [37] Deep learning and model personalization in sensor-based human activity recognition
    Ferrari, A.
    Micucci, D.
    Mobilio, M.
    Napoletano, P.
    Journal of Reliable Intelligent Environments, 2023, 9 (01) : 27 - 39
  • [38] AutoAugHAR: Automated Data Augmentation for Sensor-based Human Activity Recognition
    Zhou, Yexu
    Zhao, Haibin
    Huang, Yiran
    Roeddiger, Tobias
    Kurnaz, Murat
    Riedel, Till
    Beigl, Michael
    PROCEEDINGS OF THE ACM ON INTERACTIVE MOBILE WEARABLE AND UBIQUITOUS TECHNOLOGIES-IMWUT, 2024, 8 (02):
  • [39] LOCAL AND GLOBAL ALIGNMENTS FOR GENERALIZABLE SENSOR-BASED HUMAN ACTIVITY RECOGNITION
    Lu, Wang
    Wang, Jindong
    Chen, Yiqiang
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 3833 - 3837
  • [40] A Hybrid Deep Neural Networks for Sensor-based Human Activity Recognition
    Wang, Shujuan
    Zhu, Xiaoke
    2020 12TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2020, : 486 - 491