Respiration Tracking for People Counting and Recognition

Cited by: 52
Authors
Wang, Fengyu [1 ,2 ]
Zhang, Feng [1 ,2 ]
Wu, Chenshu [1 ,2 ]
Wang, Beibei [1 ,2 ]
Liu, K. J. Ray [1 ,2 ]
Affiliations
[1] Univ Maryland, Dept Elect & Comp Engn, College Pk, MD 20742 USA
[2] Origin Wireless Inc, Dept Res & Dev, Greenbelt, MD 20770 USA
Keywords
Wireless fidelity; Smart homes; Wireless communication; Internet of Things; Sensors; Iterative algorithms; Dynamic programming; Crowd counting; identity matching; multipeople breathing estimation; people recognition; wireless sensing; WiFi
DOI
10.1109/JIOT.2020.2977254
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Wireless detection of respiration rates is crucial for many applications. Most state-of-the-art solutions estimate breathing rates with prior knowledge of the crowd size and assume that different users have distinct breathing rates, assumptions that are neither natural nor realistic. Moreover, few of them can leverage the estimated breathing rates to recognize human subjects (also known as identity matching). In this article, using the channel state information (CSI) of a single pair of commercial WiFi devices, a novel system is proposed to continuously track the breathing rates of multiple persons without such impractical assumptions. The proposed solution includes an adaptive subcarrier combination method that boosts the signal-to-noise ratio (SNR) of breathing signals, together with an iterative dynamic programming and trace concatenation algorithm that continuously tracks the breathing rates of multiple users. By leveraging both the spectral and temporal diversity of the CSI, the system can correctly extract the breathing-rate traces even if some of them merge for a short time period. Furthermore, by utilizing the obtained breathing traces, the system can perform people counting and recognition simultaneously. Extensive experiments are conducted in two environments (an on-campus lab and a car). The results show that an average accuracy of 86% is achieved for counting up to four people in both environments. In 97.9% of all test cases, the absolute error of the crowd-size estimate is within one. The system achieves an average accuracy of 85.78% for people recognition in a smart-home scenario.
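The pipeline summarized above combines CSI subcarriers weighted by their breathing-band SNR and then reads breathing rates off the combined spectrum. The following is a minimal illustrative sketch of that idea, not the authors' implementation: the function name, the SNR weighting, and the simple peak picking are all assumptions, and the full system additionally tracks rates over time with iterative dynamic programming and trace concatenation, which is omitted here.

```python
import numpy as np

def estimate_breathing_rates(csi, fs, max_people=4):
    """Illustrative sketch: combine CSI amplitude series across subcarriers,
    weighted by a crude per-subcarrier SNR, then pick the strongest spectral
    peaks in the breathing band (roughly 10-40 breaths per minute).

    csi: array of shape (n_subcarriers, n_samples), real amplitude series
    fs:  sampling rate in Hz
    Returns candidate breathing rates in breaths per minute, ascending.
    """
    _, n = csi.shape
    # Detrend each subcarrier so slow drift does not dominate the spectrum.
    x = csi - csi.mean(axis=1, keepdims=True)
    spectra = np.abs(np.fft.rfft(x, axis=1)) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    band = (freqs >= 10 / 60) & (freqs <= 40 / 60)  # breathing band, in Hz
    # Crude per-subcarrier SNR: in-band peak power over out-of-band mean.
    signal = spectra[:, band].max(axis=1)
    noise = spectra[:, ~band].mean(axis=1) + 1e-12
    weights = signal / noise
    # Adaptive combination: SNR-weighted sum of the subcarrier spectra.
    combined = (weights[:, None] * spectra).sum(axis=0)
    # Candidate rates: the strongest in-band bins, converted to breaths/min.
    band_idx = np.flatnonzero(band)
    top = band_idx[np.argsort(combined[band_idx])[::-1][:max_people]]
    return sorted(60.0 * freqs[top])
```

In this sketch, subcarriers whose spectra show a clear periodic component in the breathing band contribute more to the combined spectrum, which is the intuition behind SNR-boosting subcarrier combination; disambiguating merged traces over time is what the dynamic-programming stage of the actual system addresses.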
Pages: 5233-5245
Number of pages: 13