A Machine Learning Platform for Multirotor Activity Training and Recognition

Cited by: 1
Authors
De La Rosa, Matthew [1 ]
Chen, Yinong [1 ]
Affiliations
[1] Arizona State Univ, Sch Comp Informat & Decis Syst Engn, Tempe, AZ 85281 USA
Keywords
Machine learning; training and recognition; Internet of Things; VIPLE; cloud computing; orchestration; education; classification; multirotor; INTERNET; THINGS;
DOI
10.1109/isads45777.2019.9155812
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Machine learning is a new paradigm of problem solving. Teaching machine learning in schools and colleges to prepare students for industry needs has become imperative, not only in computing majors but in all engineering disciplines. This paper develops a new, hands-on approach to teaching machine learning by training a linear classifier and applying that classifier to solve Multirotor Activity Recognition (MAR) problems in an online lab setting. MAR labs leverage cloud computing and data storage technologies to host a versatile environment capable of logging, orchestrating, and visualizing the solution for an MAR problem through a user interface. This work extends Arizona State University's Visual IoT/Robotics Programming Language Environment (VIPLE) as a control platform for multirotors used in data collection. VIPLE is a platform developed for teaching computational thinking, visual programming, Internet of Things (IoT), and robotics application development.
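The abstract describes training a linear classifier for multirotor activity recognition. The paper's actual classifier, features, and data are not given here, so the following is only a hypothetical sketch: a plain perceptron trained on synthetic two-dimensional flight features (the feature names "mean vertical accel" and "mean pitch rate", the labels, and all values are assumptions for illustration).

```python
# Hypothetical sketch of a linear classifier for activity recognition:
# a perceptron separating two synthetic multirotor activities
# ("hover" = -1 vs. "forward flight" = +1) from 2-D feature vectors.

def train_perceptron(samples, labels, epochs=100, lr=0.1):
    """Learn weights w and bias b so that sign(w.x + b) predicts the +1/-1 label."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:  # misclassified: nudge the hyperplane toward the sample
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    """Classify a feature vector with the learned linear decision boundary."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Synthetic feature vectors: [mean |vertical accel|, mean |pitch rate|]
hover   = [[0.1, 0.05], [0.15, 0.1], [0.2, 0.08]]   # label -1
forward = [[0.9, 0.8],  [1.1, 0.9],  [0.95, 0.85]]  # label +1
X = hover + forward
y = [-1] * 3 + [1] * 3

w, b = train_perceptron(X, y)
print(predict(w, b, [0.12, 0.07]))  # hover-like sample -> -1
print(predict(w, b, [1.0, 0.9]))    # forward-like sample -> 1
```

In a lab setting like the one described, the feature vectors would instead be derived from flight logs collected through the VIPLE-controlled multirotor.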
Pages: 15-22 (8 pages)
Related Papers
50 items in total (showing [41]-[50])
  • [41] Towards a Modular Machine Learning Architecture for Human Activity Recognition
    Schroth, Marc
    Birkenmaier, Dennis
    Stork, Wilhelm
    2024 IEEE SENSORS APPLICATIONS SYMPOSIUM, SAS 2024, 2024,
  • [42] Machine Learning Models for Activity Recognition and Authentication of Smartphone Users
    Ahmadi, S. Sareh
    Rashad, Sherif
    Elgazzar, Heba
    2019 IEEE 10TH ANNUAL UBIQUITOUS COMPUTING, ELECTRONICS & MOBILE COMMUNICATION CONFERENCE (UEMCON), 2019, : 561 - 567
  • [43] Wind estimation by multirotor dynamic state measurement and machine learning models
    Zimmerman, Steven
    Yeremi, Miayan
    Nagamune, Ryozo
    Rogak, Steven
    MEASUREMENT, 2022, 198
  • [44] Estimating the difficulty of a learning activity from the training cost for a machine learning algorithm
    Gallego-Duran, Francisco
    Molina-Carmona, Rafael
    Llorens-Largo, Faraon
    SIXTH INTERNATIONAL CONFERENCE ON TECHNOLOGICAL ECOSYSTEMS FOR ENHANCING MULTICULTURALITY (TEEM'18), 2018, : 654 - 659
  • [45] Autonomic machine learning platform
    Lee, Keon Myung
    Yoo, Jaesoo
    Kim, Sang-Wook
    Lee, Jee-Hyong
    Hong, Jiman
    INTERNATIONAL JOURNAL OF INFORMATION MANAGEMENT, 2019, 49 : 491 - 501
  • [46] The Machine Learning Management Platform
    Chen, Meiyu
    You, Xiangdong
    PROCEEDINGS OF 2017 8TH IEEE INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING AND SERVICE SCIENCE (ICSESS 2017), 2017, : 739 - 742
  • [47] Data Platform for Machine Learning
    Agrawal, Pulkit
    Arya, Rajat
    Bindal, Aanchal
    Bhatia, Sandeep
    Gagneja, Anupriya
    Godlewski, Joseph
    Low, Yucheng
    Muss, Timothy
    Paliwal, Mudit Manu
    Raman, Sethu
    Shah, Vishrut
    Shen, Bochao
    Sugden, Laura
    Zhao, Kaiyu
    Wu, Ming-Chuan
    SIGMOD '19: PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA, 2019, : 1803 - 1816
  • [48] EasyASR: A Distributed Machine Learning Platform for End-to-end Automatic Speech Recognition
    Wang, Chengyu
    Cheng, Mengli
    Hu, Xu
    Huang, Jun
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 16111 - 16113
  • [49] A Platform That Directly Evolves Multirotor Controllers
    Howard, David
    IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2017, 21 (06) : 943 - 955
  • [50] Machine learning and deep learning models for human activity recognition in security and surveillance: a review
    Waghchaware, Sheetal
    Joshi, Radhika
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (08) : 4405 - 4436