A Pilot Study to Detect Agitation in People Living with Dementia Using Multi-Modal Sensors

Cited by: 24
Authors
Spasojevic, S. [1 ,2 ]
Nogas, J. [2 ]
Iaboni, A. [1 ,2 ]
Ye, B. [1 ,2 ]
Mihailidis, A. [1 ,2 ]
Wang, A. [3 ]
Li, S. J. [3 ]
Martin, L. S. [3 ]
Newman, K. [3 ]
Khan, S. S. [1 ,2 ]
Affiliations
[1] Toronto Rehab Univ Hlth Network, KITE, Toronto, ON, Canada
[2] Univ Toronto, Toronto, ON, Canada
[3] Ryerson Univ, Toronto, ON, Canada
Keywords
Agitation; Dementia; Multi-modal sensors; Machine learning; Feature extraction; Electrodermal activity
DOI
10.1007/s41666-021-00095-7
Chinese Library Classification: TP [Automation Technology; Computer Technology]
Subject Classification Code: 0812
Abstract
People living with dementia (PLwD) often exhibit behavioral and psychological symptoms, such as episodes of agitation and aggression. Agitated behavior in PLwD causes distress and increases the risk of injury to both patients and caregivers. In this paper, we present the use of a multi-modal wearable device that captures motion and physiological indicators to detect agitation in PLwD. We identify the features extracted from the sensor signals that are most relevant for agitation detection. We hypothesize that combining multi-modal sensor data is more effective in identifying agitation in PLwD than data from a single sensor. The results of this unique pilot study are based on data from 17 participants, collected over 600 days from PLwD admitted to a Specialized Dementia Unit. Our findings show the importance of using multi-modal sensor data and highlight the most significant features for agitation detection.
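The pipeline the abstract describes (statistical features extracted per sensor modality, then combined for detection) can be sketched as follows. This is a minimal illustration only: the specific features (mean, standard deviation, range), the early-fusion-by-concatenation strategy, and all function names are assumptions for the sketch, not the authors' actual method.

```python
import statistics

def window_features(samples):
    """Simple statistical features for one sensor window:
    mean, population standard deviation, and range."""
    return [
        statistics.fmean(samples),
        statistics.pstdev(samples),
        max(samples) - min(samples),
    ]

def fuse_modalities(accel_window, eda_window):
    """Early fusion: concatenate per-modality feature vectors
    into a single input for a downstream classifier."""
    return window_features(accel_window) + window_features(eda_window)

# Example: one short window per modality (values are illustrative)
accel = [0.1, 0.4, 0.2, 0.9, 0.5]   # accelerometer magnitude (g)
eda = [2.1, 2.3, 2.2, 2.8, 2.6]     # electrodermal activity (microsiemens)
features = fuse_modalities(accel, eda)
print(len(features))  # 6 features: 3 per modality
```

With single-sensor input, only one of the two feature vectors would be available; the paper's hypothesis is that the fused vector carries more signal for agitation detection than either alone.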
Pages: 342-358
Page count: 17