Approaching the Real-World: Supporting Activity Recognition Training with Virtual IMU Data

Cited by: 22
Authors
Kwon, Hyeokhyen [1 ]
Wang, Bingyao [2 ]
Abowd, Gregory D. [3 ]
Ploetz, Thomas [1 ]
Affiliations
[1] Georgia Inst Technol, Sch Interact Comp, Atlanta, GA 30332 USA
[2] Georgia Inst Technol, Coll Comp, Atlanta, GA 30332 USA
[3] Northeastern Univ, Dept Elect & Comp Engn, Boston, MA 02115 USA
Keywords
Activity Recognition; Data Collection; Machine Learning
DOI
10.1145/3478096
CLC Classification
TP [Automation Technology; Computer Technology]
Subject Classification
0812
Abstract
Recently, IMUTube introduced a paradigm change for bootstrapping human activity recognition (HAR) systems for wearables. The key idea is to utilize videos of activities to support training activity recognizers based on inertial measurement units (IMUs). The system retrieves videos from public repositories and subsequently generates virtual IMU data from them. The ultimate vision for such a system is to make large amounts of weakly labeled videos accessible for model training in HAR and, as such, to overcome one of the most pressing issues in the field: the lack of significant amounts of labeled sample data. In this paper, we present the first in-depth exploration of IMUTube in a realistic assessment scenario: the analysis of free-weight gym exercises. We make significant progress towards a flexible, fully functional IMUTube system by extending it to handle a range of artifacts that are common in unrestricted online videos, including various forms of video noise, non-human poses, body-part occlusions, and extreme camera and human motion. By overcoming these real-world challenges, we are able to generate high-quality virtual IMU data, which allows us to employ IMUTube for practical analysis tasks. We show that HAR systems trained by incorporating virtual sensor data generated by IMUTube significantly outperform baseline models trained only with real IMU data. In doing so, we demonstrate the practical utility of IMUTube and the progress made towards the final vision of this new bootstrapping paradigm.
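The core idea behind virtual IMU data, deriving accelerometer-like signals from 3D joint trajectories tracked in video, can be illustrated with a minimal sketch. The snippet below is an assumption-laden simplification, not the authors' pipeline: it approximates acceleration by double finite differencing of a joint trajectory and adds gravity in the world frame, whereas the full IMUTube system additionally estimates limb orientation, rotates signals into the sensor's local frame, and adapts the signal distribution to real sensors. The function name and toy trajectory are hypothetical.

```python
import numpy as np

def virtual_accel_from_trajectory(positions, fps, g=9.81):
    """Approximate virtual accelerometer samples from a 3D joint trajectory.

    Illustrative sketch only (not the IMUTube implementation).
    positions : (T, 3) array of joint positions in meters, world frame
    fps       : video frame rate in Hz
    returns   : (T-2, 3) array of acceleration samples in m/s^2
    """
    dt = 1.0 / fps
    # Second-order central difference:
    # a[t] ~ (p[t+1] - 2*p[t] + p[t-1]) / dt^2
    accel = (positions[2:] - 2.0 * positions[1:-1] + positions[:-2]) / dt**2
    # A real accelerometer also senses gravity; add it along the world
    # z-axis. A full pipeline would further rotate each sample into the
    # sensor's local (body) frame using the estimated limb orientation.
    accel[:, 2] += g
    return accel

# Toy usage: a wrist oscillating vertically at 1 Hz, tracked at 30 fps.
t = np.arange(0.0, 2.0, 1.0 / 30.0)
wrist = np.stack([np.zeros_like(t), np.zeros_like(t),
                  0.1 * np.sin(2 * np.pi * t)], axis=1)
print(virtual_accel_from_trajectory(wrist, fps=30).shape)  # (58, 3)
```

Such synthesized samples, labeled via the activity shown in the source video, can then be mixed with real IMU recordings when training a HAR classifier, which is the training setup the abstract evaluates.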
Pages: 32