Using Cultural Algorithms to Improve Wearable Device Gesture Recognition Performance

Author(s):  
Faisal Waris ◽  
Robert G. Reynolds


Sensors ◽  
2021 ◽  
Vol 21 (3) ◽  
pp. 1007
Author(s):  
Chi Xu ◽  
Yunkai Jiang ◽  
Jun Zhou ◽  
Yi Liu

Hand gesture recognition and hand pose estimation are two closely correlated tasks. In this paper, we propose a deep-learning-based approach that jointly learns an intermediate-level shared feature for these two tasks, so that the hand gesture recognition task can benefit from the hand pose estimation task. In the training process, a semi-supervised training scheme is designed to solve the problem of lacking proper annotation. Our approach detects the foreground hand, recognizes the hand gesture, and estimates the corresponding 3D hand pose simultaneously. To evaluate the hand gesture recognition performance of state-of-the-art methods, we propose a challenging hand gesture recognition dataset collected in unconstrained environments. Experimental results show that our gesture recognition accuracy is significantly boosted by leveraging the knowledge learned from the hand pose estimation task.
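As a rough illustration of the shared-feature idea, a forward pass in which one intermediate feature feeds both a gesture head and a pose head might look like the sketch below. All dimensions, the single linear encoder, and the weight initialisation are assumptions for illustration, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions are illustrative, not taken from the paper.
D_IN, D_SHARED, N_GESTURES, N_JOINTS = 128, 64, 10, 21

# One shared encoder and two task-specific heads.
W_shared = rng.standard_normal((D_SHARED, D_IN)) * 0.01
W_gesture = rng.standard_normal((N_GESTURES, D_SHARED)) * 0.01
W_pose = rng.standard_normal((N_JOINTS * 3, D_SHARED)) * 0.01

def forward(x):
    """Produce gesture logits and a 3D pose from one shared feature."""
    h = np.maximum(W_shared @ x, 0.0)          # shared intermediate feature (ReLU)
    gesture_logits = W_gesture @ h             # gesture classification head
    pose = (W_pose @ h).reshape(N_JOINTS, 3)   # 3D hand pose regression head
    return gesture_logits, pose

x = rng.standard_normal(D_IN)                  # stand-in for an image feature
logits, pose = forward(x)
print(logits.shape, pose.shape)                # (10,) (21, 3)
```

Because both heads backpropagate into `W_shared` during training, supervision from the pose task shapes the feature the gesture head consumes, which is the mechanism the abstract credits for the accuracy boost.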


Sensors ◽  
2019 ◽  
Vol 19 (16) ◽  
pp. 3548 ◽  
Author(s):  
Piotr Kaczmarek ◽  
Tomasz Mańkowski ◽  
Jakub Tomczyński

In this paper, we present the putEMG dataset, intended for the evaluation of hand gesture recognition methods based on the sEMG signal. The dataset was acquired from 44 able-bodied subjects and includes 8 gestures (3 full-hand gestures, 4 pinches, and idle). It consists of uninterrupted recordings of 24 sEMG channels from the subject’s forearm, an RGB video stream, and depth camera images used for hand motion tracking. Moreover, exemplary processing scripts are also published. The putEMG dataset is available under a Creative Commons Attribution-NonCommercial 4.0 International licence (CC BY-NC 4.0). The dataset was validated with regard to sEMG amplitudes and gesture recognition performance. The classification was performed using state-of-the-art classifiers and feature sets. An accuracy of 90% was achieved for an SVM classifier utilising the RMS feature and for an LDA classifier using Hudgins’ and Du’s feature sets. Analysis of performance for particular gestures showed that the LDA/Du combination has significantly higher accuracy for full-hand gestures, while SVM/RMS performs better for pinch gestures. The presented dataset can be used as a benchmark for various classification methods, the evaluation of electrode localisation concepts, or the development of classification methods invariant to user-specific features or electrode displacement.
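The RMS and Hudgins' time-domain features mentioned above have standard definitions; a minimal NumPy sketch follows. The window content and the small noise threshold `eps` are illustrative assumptions, and Du's feature set is omitted.

```python
import numpy as np

def rms(x):
    """Root-mean-square amplitude of one sEMG channel window."""
    return np.sqrt(np.mean(x ** 2))

def hudgins_features(x, eps=1e-8):
    """Hudgins' time-domain set: mean absolute value, waveform length,
    zero crossings, slope sign changes. `eps` is a noise threshold."""
    mav = np.mean(np.abs(x))
    wl = np.sum(np.abs(np.diff(x)))
    zc = np.sum((x[:-1] * x[1:] < 0) & (np.abs(np.diff(x)) > eps))
    d = np.diff(x)
    ssc = np.sum((d[:-1] * d[1:] < 0) &
                 ((np.abs(d[:-1]) > eps) | (np.abs(d[1:]) > eps)))
    return np.array([mav, wl, zc, ssc], dtype=float)

window = np.sin(np.linspace(0, 6 * np.pi, 200))   # stand-in for one sEMG window
feats = np.concatenate([[rms(window)], hudgins_features(window)])
print(feats.shape)                                # (5,)
```

In the benchmark above, such per-channel features would be computed over sliding windows of the 24 channels and concatenated before being passed to the SVM or LDA classifier.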


Author(s):  
Eichi Tamura ◽  
Yoshihiro Yamashita ◽  
Taisei Yamashita ◽  
Eri Sato-Shimokawara ◽  
...  

Finger pointing is an intuitive method for people to direct a robot to move to a certain location. We propose a system that enables the movement operation of a mobility robot by using finger-pointing gestures for an automatic and intuitive driving experience. We employ a method to recognize gestures by using video images from a USB camera mounted on a wearable device. Our method does not require the use of infrared sensors. Three movement commands for forward motion, turning, and stopping are chosen based on gesture recognition, face orientation detection, and an intelligent safety system. We experimentally demonstrate the usefulness of the system using a scooter-type mobility robot.
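The decision logic described above, three movement commands chosen from gesture recognition, face orientation, and a safety system, might be sketched as follows. The gesture labels, function name, and override ordering are assumptions for illustration, not the authors' implementation.

```python
def select_command(gesture, face_forward, obstacle_detected):
    """Map recognition results to a mobility-robot movement command."""
    if obstacle_detected:            # safety system overrides everything
        return "stop"
    if not face_forward:             # driver not looking ahead: do not move
        return "stop"
    if gesture == "point_forward":
        return "forward"
    if gesture in ("point_left", "point_right"):
        return "turn_" + gesture.split("_")[1]
    return "stop"                    # unrecognised gesture defaults to stopping

print(select_command("point_forward", True, False))   # forward
print(select_command("point_left", True, False))      # turn_left
print(select_command("point_forward", True, True))    # stop
```

Defaulting every ambiguous case to "stop" reflects the safety-first framing of the system: motion is only commanded when gesture, gaze, and the safety check all agree.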


2020 ◽  
Vol 17 (1) ◽  
pp. 177-181 ◽  
Author(s):  
Amritha Purushothaman ◽  
Suja Palaniswamy

The smart home has gained popularity not only as a luxury but also for its numerous advantages. It is especially useful for senior citizens and children with disabilities. In this work, home automation is achieved using gestures to control appliances. Gesture recognition is an area in which a great deal of research and innovation is blooming. This paper discusses the development of a wearable device that captures hand gestures. The wearable device uses an accelerometer and gyroscopes to sense and capture the tilting, rotation, and acceleration of hand movement. Four different hand gestures are captured using this wearable device, and a machine learning algorithm, namely the Support Vector Machine, is used to classify the gestures and switch appliances ON and OFF.
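Before SVM classification, tilt and rotation features must be derived from the raw accelerometer and gyroscope samples. The sketch below shows one common way to do this; the function names, the five-feature choice, and the static-device assumption are illustrative, not the paper's method.

```python
import math

def tilt_angles(ax, ay, az):
    """Roll and pitch (radians) from one accelerometer sample,
    assuming gravity dominates (device roughly static)."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def window_features(accel, gyro):
    """Per-window feature vector: mean roll, mean pitch, and mean
    angular rate per axis. `accel`/`gyro` are lists of (x, y, z) samples."""
    rolls, pitches = zip(*(tilt_angles(*a) for a in accel))
    mean_rates = [sum(g[i] for g in gyro) / len(gyro) for i in range(3)]
    return [sum(rolls) / len(rolls), sum(pitches) / len(pitches), *mean_rates]

# Device lying flat: gravity along +z, no rotation -> all features near zero.
feats = window_features([(0.0, 0.0, 9.81)] * 10, [(0.0, 0.0, 0.0)] * 10)
```

Each gesture recording would yield one such feature vector, and the labelled vectors for the four gestures would then train the SVM classifier.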


2020 ◽  
Vol 25 (6) ◽  
pp. 2447-2458 ◽  
Author(s):  
Shu Shen ◽  
Kang Gu ◽  
Xin-Rong Chen ◽  
Cai-Xia Lv ◽  
Ru-Chuan Wang

Sensors ◽  
2019 ◽  
Vol 19 (18) ◽  
pp. 3827 ◽  
Author(s):  
Minwoo Kim ◽  
Jaechan Cho ◽  
Seongjoo Lee ◽  
Yunho Jung

We propose an efficient hand gesture recognition (HGR) algorithm, which can cope with time-dependent data from an inertial measurement unit (IMU) sensor and support real-time learning for various human-machine interface (HMI) applications. Although the data extracted from IMU sensors are time-dependent, most existing HGR algorithms do not consider this characteristic, which results in the degradation of recognition performance. Because the dynamic time warping (DTW) technique considers the time-dependent characteristic of IMU sensor data, the recognition performance of DTW-based algorithms is better than that of others. However, the DTW technique requires a very complex learning algorithm, which makes it difficult to support real-time learning. To solve this issue, the proposed HGR algorithm is based on a restricted column energy (RCE) neural network, which has a very simple learning scheme in which neurons are activated when necessary. By replacing the metric calculation of the RCE neural network with DTW distance, the proposed algorithm exhibits superior recognition performance for time-dependent sensor data while supporting real-time learning. Our verification results on a field-programmable gate array (FPGA)-based test platform show that the proposed HGR algorithm can achieve a recognition accuracy of 98.6% and supports real-time learning and recognition at an operating frequency of 150 MHz.
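The core idea, replacing the RCE network's distance metric with DTW so that a new neuron (prototype) is committed only when no existing one covers a training sequence, can be sketched as follows. The radius value, labels, and 1-D sequences are illustrative assumptions; the paper's FPGA implementation is not reproduced here.

```python
import math

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1],
                                 cost[i - 1][j - 1])
    return cost[n][m]

class RCEClassifier:
    """RCE-style network with DTW as its metric: a prototype neuron is
    created only when no same-label neuron already covers the sample."""
    def __init__(self, radius):
        self.radius = radius
        self.prototypes = []            # (sequence, label) pairs

    def train_sample(self, seq, label):
        for proto, plabel in self.prototypes:
            if plabel == label and dtw_distance(seq, proto) <= self.radius:
                return                  # already covered: no new neuron
        self.prototypes.append((seq, label))   # activate a new neuron

    def predict(self, seq):
        return min(self.prototypes, key=lambda p: dtw_distance(seq, p[0]))[1]

clf = RCEClassifier(radius=1.0)
clf.train_sample([0, 1, 2, 3], "swipe")
clf.train_sample([0, 0, 1, 2, 3], "swipe")     # warps onto the first prototype
clf.train_sample([3, 2, 1, 0], "reverse")
print(len(clf.prototypes), clf.predict([0, 1, 1, 2, 3]))   # 2 swipe
```

Because training is just a covered-or-not test followed by an optional append, learning stays cheap enough for the real-time operation the abstract describes, while the DTW metric absorbs timing variation between repetitions of the same gesture.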

