Commercial exoskeletons currently utilize multiple sensors, including inertial measurement units, electromyography sensors, and torque/force sensors, to detect human motion. While these sensors improve motion recognition by leveraging their unique strengths, they can also lead to discomfort due to direct skin contact, added weight, and complex wiring. In this paper, we propose a simplified motion recognition method that relies solely on encoders embedded in the motors. Our approach aims to accurately classify various movements by learning their distinctive features through a deep learning model. Specifically, we employ a convolutional neural network algorithm optimized for motion classification. Experimental results show that our model can effectively differentiate between movements such as standing, lifting, level walking, and inclined walking, achieving a test accuracy of 98.76%. Additionally, by implementing a sliding window maximum algorithm that tracks three consecutive classifications, we achieved a real-time motion recognition accuracy of 97.48% with a response time of 0.25 seconds. This approach provides a cost-effective and simplified solution for lower limb motion recognition, with potential applications in rehabilitation-focused exoskeletons.
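As a rough illustration of the pipeline this abstract describes, the sketch below pairs a small 1D convolutional classifier over motor-encoder windows with a three-sample sliding-window vote over its frame-level outputs. The channel count, window length, class labels, and the use of PyTorch are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): a small 1D CNN over motor-encoder
# windows plus a three-sample sliding-window vote to smooth real-time predictions.
from collections import Counter, deque

import torch
import torch.nn as nn

MOTIONS = ["standing", "lifting", "level_walking", "inclined_walking"]


class EncoderCNN(nn.Module):
    """Classifies a short window of encoder signals into one of four motions.

    Assumes 2 encoder channels (e.g., left/right hip motors) sampled over a
    50-step window; these sizes are assumptions for illustration.
    """

    def __init__(self, channels: int = 2, window: int = 50, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 16, kernel_size=5), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5), nn.ReLU(), nn.MaxPool1d(2),
        )
        # Work out the flattened feature size with a dummy forward pass.
        with torch.no_grad():
            flat = self.features(torch.zeros(1, channels, window)).numel()
        self.classifier = nn.Linear(flat, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))


class SlidingVote:
    """Keeps the last three frame-level predictions and reports the most frequent."""

    def __init__(self, size: int = 3):
        self.history = deque(maxlen=size)

    def update(self, label: int) -> int:
        self.history.append(label)
        return Counter(self.history).most_common(1)[0][0]


if __name__ == "__main__":
    model = EncoderCNN()
    voter = SlidingVote()
    window = torch.randn(1, 2, 50)             # one window of encoder data
    frame_label = int(model(window).argmax(dim=1))
    print(MOTIONS[voter.update(frame_label)])  # smoothed motion estimate
```

In a real-time loop, each new encoder window would be pushed through the network and the vote, so that a single misclassified frame cannot flip the reported motion.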
In this paper, a prosthetic robot hand was designed and fabricated, and its ability to perform basic gripping motions was evaluated experimentally. As a first step, a robot finger was designed with the same structural configuration as the human hand, and its movement was evaluated through kinematic analysis. Electromyogram (EMG) signals for hand motions were measured with commercial wearable EMG sensors, and the motions were classified using an artificial neural network (ANN). After training and testing on three kinds of gripping motions, the ANN achieved high classification accuracy. A prototype of the proposed robot hand was manufactured by 3D printing, with servomotors included for position control of the fingers. The results demonstrate that the proposed prosthetic robot hand can effectively realize gripping motions in real time using EMG measurement and machine learning-based classification.
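The EMG-to-grip pipeline in this abstract can likewise be sketched with an off-the-shelf neural network. The sketch below assumes RMS features from an 8-channel wearable EMG band and three grip labels purely for illustration, and uses scikit-learn's MLPClassifier rather than the authors' own network.

```python
# Illustrative sketch (not the authors' implementation): RMS features from
# windowed EMG channels fed to a small multilayer perceptron for grip classification.
import numpy as np
from sklearn.neural_network import MLPClassifier

GRIPS = ["power_grip", "pinch_grip", "lateral_grip"]  # assumed label set


def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square per EMG channel; window has shape (samples, channels)."""
    return np.sqrt(np.mean(window ** 2, axis=0))


# Toy data standing in for recorded EMG: 300 windows, 200 samples, 8 channels.
rng = np.random.default_rng(0)
raw = rng.standard_normal((300, 200, 8))
labels = rng.integers(0, len(GRIPS), size=300)
X = np.stack([rms_features(w) for w in raw])

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X[:240], labels[:240])
print("held-out accuracy:", clf.score(X[240:], labels[240:]))

# In a real-time setup, each new window would be converted to features and the
# predicted grip index mapped to servomotor target angles for the fingers.
new_window = rng.standard_normal((200, 8))
print("predicted grip:", GRIPS[int(clf.predict(rms_features(new_window)[None, :])[0])])
```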