Motion recognition and classification are crucial for exoskeleton applications in rehabilitation, activities of daily living (ADL), and entertainment. Accurate activity analysis is essential for improving human-machine coupling. However, conventional single-task detection systems, which target specific requirements such as a finite set of gait events, mode transitions (e.g., standing-to-sitting), or locomotion speed, cannot handle the complex and varied walking environments encountered during ADL. This article proposes a real-time multiclassifier system that incorporates three artificial neural network (ANN) models to simultaneously recognize five gait events, nine activities, and walking speeds ranging from 0 to 8 km/h. Three machine-learning (ML) algorithms were fused to minimize reliance on manual thresholding methods. Activity detection, speed recognition, and gait detection were performed using a one-dimensional convolutional neural network (1-D CNN), a regression ANN (RANN), and a multilayer perceptron (MLP), respectively. The experiment was conducted with five subjects wearing a cable-driven exoskeleton under development. The results demonstrate that the proposed portable motion recognition system accurately detected various movements: gait events with 99.6% accuracy and a time error of 33 ms, walking speed with a mean squared error (MSE) of 0.12, and activities with 96.8% accuracy.
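To make the three-model arrangement concrete, the sketch below shows how one shared sensor window could feed an MLP (gait events), a 1-D CNN (activities), and a regression ANN (speed) in parallel, as the abstract describes. This is a minimal illustration only: the window size, channel count, layer widths, feature choices, and untrained random weights are all assumptions, not the architecture or parameters reported in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical input shape: a 100-sample window of 6 IMU channels.
WINDOW, CHANNELS = 100, 6
N_GAIT_EVENTS, N_ACTIVITIES = 5, 9  # five gait events, nine activities (from the abstract)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class MLPGaitClassifier:
    """Multilayer perceptron: flattened window -> scores for 5 gait events."""
    def __init__(self):
        self.w1 = rng.normal(0, 0.01, (WINDOW * CHANNELS, 32))
        self.w2 = rng.normal(0, 0.01, (32, N_GAIT_EVENTS))

    def predict(self, window):
        h = relu(window.ravel() @ self.w1)
        return softmax(h @ self.w2)

class Conv1DActivityClassifier:
    """1-D CNN: temporal convolution per channel, pooled, then a dense head."""
    def __init__(self, kernel=9):
        self.kernels = rng.normal(0, 0.1, (CHANNELS, kernel))
        self.w = rng.normal(0, 0.01, (CHANNELS, N_ACTIVITIES))

    def predict(self, window):
        # One valid convolution per channel, global-average-pooled to a scalar.
        feats = np.array([
            np.convolve(window[:, c], self.kernels[c], mode="valid").mean()
            for c in range(CHANNELS)
        ])
        return softmax(relu(feats) @ self.w)

class RegressionANN:
    """Regression ANN: simple window statistics -> scalar walking speed (km/h)."""
    def __init__(self):
        self.w1 = rng.normal(0, 0.1, (2 * CHANNELS, 16))
        self.w2 = rng.normal(0, 0.1, (16, 1))

    def predict(self, window):
        feats = np.concatenate([window.mean(axis=0), window.std(axis=0)])
        speed = float(relu(feats @ self.w1) @ self.w2)
        # Clamp to the 0-8 km/h operating range stated in the abstract.
        return float(np.clip(speed, 0.0, 8.0))

def classify(window, mlp, cnn, rann):
    """Run the three models on one shared sensor window, as in the fused system."""
    return {
        "gait_event": int(np.argmax(mlp.predict(window))),
        "activity": int(np.argmax(cnn.predict(window))),
        "speed_kmh": rann.predict(window),
    }

window = rng.normal(size=(WINDOW, CHANNELS))  # stand-in for real exoskeleton sensor data
out = classify(window, MLPGaitClassifier(), Conv1DActivityClassifier(), RegressionANN())
print(out)
```

The key design point the sketch reflects is that the three tasks share the same input stream but are decoupled into separate models, so each can be trained and evaluated against its own target (discrete gait events, discrete activities, continuous speed) without hand-tuned thresholds.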