PHYS.IO: Wearable hand tracking device

Author(s):  
Lucas Silva ◽  
Rummenigge Dantas ◽  
Paula Diniz ◽  
Victor Jeronimo ◽  
Luque Bueno ◽  
...  

2021 ◽  
Vol 11 (7) ◽  
pp. 2943 ◽  
Author(s):  
Francisco Gomez-Donoso ◽  
Felix Escalona ◽  
Nadia Nasri ◽  
Miguel Cazorla

In this work, we introduce HaReS, a hand rehabilitation system. Our proposal integrates a series of exercises, jointly developed with a foundation for those with motor and cognitive injuries, that are aimed at improving the skills of patients and their adherence to the rehabilitation plan. Our system takes advantage of a low-cost hand-tracking device to provide a quantitative analysis of the patient's performance. It also integrates a low-cost surface electromyography (sEMG) sensor to provide insight into which muscles are being activated while completing the exercises. It is also modular and can be deployed on a social robot. We tested our proposal in two different rehabilitation facilities with high success. The therapists and patients reported greater motivation while using HaReS, which improved adherence to the rehabilitation plan. In addition, the therapists were able to provide services to more patients than with their traditional methodology.
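The muscle-activation insight described above is commonly derived from an amplitude envelope of the raw sEMG signal (full-wave rectification followed by smoothing). A minimal sketch of that standard preprocessing step, assuming raw samples arrive as a list of floats; the function name and window size are illustrative, not taken from the paper:

```python
def semg_envelope(samples, window=5):
    """Estimate muscle activation from raw sEMG samples:
    full-wave rectification, then a trailing moving-average filter."""
    rectified = [abs(s) for s in samples]
    envelope = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)  # trailing window, clipped at the start
        envelope.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return envelope

# A burst of alternating-sign activity raises the envelope:
raw = [0.0, 0.1, -0.9, 1.0, -0.8, 0.9, -0.1, 0.0]
env = semg_envelope(raw, window=4)
```

A real system would first band-pass filter and notch-filter the signal; this sketch only shows the rectify-and-smooth step that turns a zero-mean signal into a non-negative activation estimate.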


Author(s):  
Umema H. Bohari ◽  
Ryan Alli ◽  
Alejandra Garcia ◽  
Vinayak R. Krishnamurthy

Abstract Drawing curves is a fundamental task in mid-air interactive applications such as 3D sketching, geometric modeling, handwriting recognition, and authentication. Existing research in mid-air drawing is solely focused on determining what the user drew, assuming that the intended curve is segmented from the continuous user-generated trajectory. In this work, our aim is to address the complementary problem: to determine when the user actually intended to draw, without the use of any prescribed gestures or hand-held controllers (e.g., Wii remote, HTC Vive). In our previously published work, we demonstrated that in mid-air drawing tasks, not only is it possible to statistically learn drawing intent from hand motion, but it is also perceived to be more natural by users. Our idea was to simply classify each instance of hand trajectories as either a stroke or a hover. Our current work investigates new representations of the users’ motion beyond a single point (such as a tracked palm) to richer multi-point trajectories obtained with other skeletal joints such as the wrist and elbow. We trained several binary classifiers on five such trajectory representations obtained from 3D drawing data from 25 users using a hand tracking device. We compare these representations and the corresponding classifiers for predicting user intent for mid-air drawing. Our extended approach resulted in improved prediction accuracy (mean: 80.17%, min: 79.92%, max: 91.30%) with respect to our earlier work (mean: 76.75%, min: 74.23%, max: 84.01%).
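The stroke-versus-hover idea above can be illustrated as binary classification over per-window trajectory features. This is only a toy sketch, not the authors' models or features: the speed-based features, the synthetic data, and the plain logistic regression are all assumptions for illustration.

```python
import math
import random

def features(window):
    """Per-window features from a list of (x, y, z) palm positions:
    mean inter-frame speed and total path length."""
    speeds = [math.dist(a, b) for a, b in zip(window, window[1:])]
    return [sum(speeds) / len(speeds), sum(speeds)]

def train_logistic(X, y, lr=0.5, epochs=200):
    """Plain stochastic-gradient-descent logistic regression."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            g = 1.0 / (1.0 + math.exp(-z)) - yi  # prediction error
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Synthetic windows: slow deliberate motion stands in for strokes,
# fast transit motion for hovers (an illustrative simplification).
random.seed(0)
def synth(speed, n=10):
    pos, pts = [0.0, 0.0, 0.0], []
    for _ in range(n):
        pts.append(tuple(pos))
        pos = [c + random.uniform(-speed, speed) for c in pos]
    return pts

strokes = [features(synth(0.01)) for _ in range(20)]  # label 1
hovers = [features(synth(0.2)) for _ in range(20)]    # label 0
w, b = train_logistic(strokes + hovers, [1] * 20 + [0] * 20)
```

The paper's multi-point representations would simply widen the feature vector (wrist and elbow trajectories concatenated with the palm's); the classification machinery stays the same.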


2012 ◽  
Vol 21 (1) ◽  
pp. 11-16 ◽  
Author(s):  
Susan Fager ◽  
Tom Jakobs ◽  
David Beukelman ◽  
Tricia Ternus ◽  
Haylee Schley

Abstract This article summarizes the design and evaluation of a new augmentative and alternative communication (AAC) interface strategy for people with complex communication needs and severe physical limitations. This strategy combines typing, gesture recognition, and word prediction to input text into AAC software using touchscreen or head-movement tracking access methods. Eight individuals with movement limitations due to spinal cord injury, amyotrophic lateral sclerosis, polio, and Guillain-Barré syndrome participated in the evaluation of the prototype technology using a head-tracking device. Fourteen typical individuals participated in the evaluation of the prototype using a touchscreen.
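The word-prediction component mentioned above is commonly implemented as a frequency-ranked prefix lookup over a word table. A minimal sketch with a hypothetical unigram table; the function name, table, and ranking are illustrative, not the article's implementation:

```python
def predict_words(prefix, frequency, k=3):
    """Return up to k candidate completions for a typed prefix,
    ranked by corpus frequency (a hypothetical unigram table)."""
    matches = [w for w in frequency if w.startswith(prefix)]
    return sorted(matches, key=lambda w: -frequency[w])[:k]

freq = {"hello": 120, "help": 300, "held": 45, "happy": 90}
candidates = predict_words("hel", freq)  # ranked: help, hello, held
```

Surfacing a few high-probability completions per keystroke is what lets AAC users with very slow selection rates (head tracking, single-switch scanning) cut the number of selections per word.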

