Bimanual Interaction
Recently Published Documents

TOTAL DOCUMENTS: 23 (last five years: 1)
H-INDEX: 6 (last five years: 0)
2021, Vol 15
Author(s): Marleen J. Schoenfeld, Ioana-Florentina Grigoras, Charlotte J. Stagg, Catharina Zich

Many tasks require the skilled interaction of both hands, such as eating with knife and fork or typing on a keyboard. However, our understanding of the behavioural and neurophysiological mechanisms underpinning bimanual motor learning is still sparse. Here, we aimed to address this by first characterising learning-related changes across different levels of bimanual interaction and then investigating how beta transcranial alternating current stimulation (tACS) modulates these learning-related changes. To explore early bimanual motor learning, we designed a novel bimanual motor learning task. In the task, a force grip device held in each hand (controlling the x- and y-axes separately) was used to move a cursor along a path of streets at different angles (0°, 22.5°, 45°, 67.5°, and 90°). Each street corresponded to a specific force ratio between the hands, which resulted in different levels of hand interaction: unimanual (Uni: 0°, 90°), bimanual with equal force (Bieq: 45°), and bimanual with unequal force (Biuneq: 22.5°, 67.5°). In experiment 1, 40 healthy participants performed the task for 45 min with a minimum of 100 trials. We found that the novel task induced improvements in movement time and error, with no trade-off between movement time and error, and with distinct patterns for the three levels of bimanual interaction. In experiment 2, we performed a between-subjects, double-blind study in 54 healthy participants to explore the effect of phase synchrony between the two sensorimotor cortices using tACS at the individual's beta peak frequency, which was quantified using electroencephalography. Twenty minutes of 2 mA peak-to-peak tACS was applied during task performance (40 min). Participants received either in-phase (0° phase shift), out-of-phase (90° phase shift), or sham (3 s of stimulation) tACS. We replicated the behavioural results of experiment 1; however, beta tACS did not modulate motor learning. Overall, the novel bimanual motor task allows bimanual motor learning to be characterised at different levels of bimanual interaction, paving the way for future neuroimaging studies to further investigate the underlying mechanisms of bimanual motor learning.
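As an illustration of the street-angle mapping described above, here is a minimal Python sketch that derives the (x, y) force components for each street angle and classifies the street into the three interaction levels. The assumption that cursor velocity is proportional to the two grip forces (so the force ratio follows the cosine and sine of the street angle) is ours; the abstract does not specify the exact mapping.

```python
import math

# Street angles used in the task (degrees); 0° and 90° are unimanual,
# 45° requires equal force, 22.5° and 67.5° require unequal force.
STREET_ANGLES = [0.0, 22.5, 45.0, 67.5, 90.0]

def force_components(angle_deg: float) -> tuple[float, float]:
    """Return the (x, y) force components, normalised to unit magnitude,
    that move the cursor along a street at `angle_deg`.
    Assumes cursor velocity is proportional to the two grip forces."""
    rad = math.radians(angle_deg)
    return math.cos(rad), math.sin(rad)

def interaction_level(angle_deg: float) -> str:
    """Classify a street into the three interaction levels from the paper."""
    fx, fy = force_components(angle_deg)
    if math.isclose(fx, 0.0, abs_tol=1e-9) or math.isclose(fy, 0.0, abs_tol=1e-9):
        return "Uni"      # one hand only (0° or 90°)
    if math.isclose(fx, fy, rel_tol=1e-9):
        return "Bieq"     # equal force from both hands (45°)
    return "Biuneq"       # unequal force (22.5°, 67.5°)

for angle in STREET_ANGLES:
    print(f"{angle:5.1f}°  ->  {interaction_level(angle)}")
```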


Author(s): Robin Volkmar, Strahinja Dosen, Jose Gonzalez-Vargas, Marcus Baum, Marko Markovic

Abstract
Background: The loss of a hand is a traumatic experience that substantially compromises an individual's ability to interact with the environment. Myoelectric prostheses are the state-of-the-art (SoA) functional replacement for lost limbs. Their overall mechanical design and dexterity have improved over the last few decades, but users have not been able to fully exploit these advances because of a lack of effective and intuitive control. Bimanual tasks are particularly challenging for an amputee, since prosthesis control needs to be coordinated with the movement of the sound limb. So far, bimanual activities have often been neglected by the prosthetic research community.
Methods: We present a novel approach to prosthesis control, which uses a semi-autonomous scheme to simplify bimanual interactions. The approach supplements commercial SoA two-channel myoelectric control with two additional sensors. Two inertial measurement units were attached to the prosthesis and the sound hand to detect the movement of both limbs. Once a bimanual interaction is detected, the system mimics the coordination strategies of able-bodied subjects to automatically adjust the prosthesis wrist rotation (pronation, supination) and grip type (lateral, palmar) to assist the sound hand during the bimanual task. The system was evaluated in eight able-bodied subjects performing functional uni- and bimanual tasks using the novel method and SoA two-channel myocontrol. The outcome measures were time to accomplish the task, misclassification rate of the semi-autonomous system, subjective rating of intuitiveness, and perceived workload (NASA TLX).
Results: The results demonstrated that the novel control interface substantially outperformed the SoA myoelectric control. With the semi-autonomous control, the time to accomplish the task and the perceived workload decreased by 25% and 27%, respectively, and the subjects rated the system as more intuitive than SoA myocontrol.
Conclusions: The novel system uses minimal additional hardware (two inertial sensors) and simple processing, and it is therefore convenient for practical implementation. With the proposed control scheme, the prosthesis assists the user's sound hand in performing bimanual interactions while decreasing the cognitive burden.
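To make the semi-autonomous scheme concrete, the following Python sketch shows one plausible way two inertial measurement units could be used to flag a bimanual interaction and pick an assistive wrist rotation and grip type. The movement threshold, roll split, and the specific wrist/grip rules are illustrative assumptions; the actual coordination strategies derived from able-bodied subjects are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class ImuSample:
    accel_mag: float   # linear acceleration magnitude (m/s^2), gravity removed
    roll_deg: float    # estimated forearm roll (pronation/supination), degrees

# Thresholds below are illustrative assumptions, not values from the paper.
MOVEMENT_THRESHOLD = 0.5   # m/s^2: a limb is considered "moving" above this
ROLL_SPLIT_DEG = 45.0      # roll angle separating the two wrist presets

def bimanual_interaction(prosthesis: ImuSample, sound_hand: ImuSample) -> bool:
    """Flag a bimanual interaction when both limbs move at the same time."""
    return (prosthesis.accel_mag > MOVEMENT_THRESHOLD
            and sound_hand.accel_mag > MOVEMENT_THRESHOLD)

def assistive_command(sound_hand: ImuSample) -> tuple[str, str]:
    """Pick a prosthesis wrist rotation and grip type that complements
    the sound hand (hypothetical rule standing in for the learned
    able-bodied coordination strategies)."""
    wrist = "pronation" if sound_hand.roll_deg < ROLL_SPLIT_DEG else "supination"
    grip = "palmar" if wrist == "pronation" else "lateral"
    return wrist, grip

prosthesis = ImuSample(accel_mag=1.2, roll_deg=10.0)
sound = ImuSample(accel_mag=0.9, roll_deg=70.0)
if bimanual_interaction(prosthesis, sound):
    print(assistive_command(sound))   # e.g. ('supination', 'lateral')
```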


2019, Vol 28 (2), pp. 63-78
Author(s): Martí Sánchez-Fibla, Sébastien Forestier, Clément Moulin-Frier, Jordi-Ysard Puigbò, Paul FMJ Verschure

The mechanisms by which the brain orchestrates multi-limb joint action have yet to be elucidated, and few computational sensorimotor (SM) learning approaches have dealt with the problem of acquiring bimanual affordances. We propose a series of bidirectional (forward/inverse) SM maps and their associated learning processes, which generalize naturally from uni- to bimanual interaction (and affordances), reinforcing the motor equivalence property. The SM maps range from a sensorimotor to a purely sensory nature: full body control, delta SM control (through small action changes), and delta sensory covariation (how body-related perceptual cues covary with object-related ones). We make several contributions on how these SM maps are learned: (1) Context and Behavior-Based Babbling: generalizing goal babbling to the interleaving of absolute and local goals, including guidance of reflexive behaviors; (2) Event-Based Learning: learning steps driven by visual and haptic events; and (3) Affordance Gradients: the vector field gradients along which an object can be manipulated. Our modeling of bimanual affordances is in line with current robotic research in forward visuomotor mappings and visual servoing, enforces the motor equivalence property, and is also consistent with neurophysiological findings such as the multiplicative encoding scheme.
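The following toy Python sketch illustrates the flavour of goal babbling with a bidirectional sensorimotor map: a forward map is built from experienced (action, outcome) pairs and inverted by nearest-neighbour lookup to reach sampled goals, with each arm controlling one dimension of the outcome. The placeholder forward model, memory structure, and exploration noise are our assumptions and greatly simplify the proposed architecture.

```python
import random

# Toy sensorimotor setup: each arm applies a 1-D force; the "sensory" outcome
# is the object's displacement along x and y (left arm drives x, right drives y).
def forward_world(action):
    left, right = action
    return (left, right)  # placeholder for the unknown body/object dynamics

memory = []  # list of (action, outcome) pairs: the learned forward map

def nearest_action(goal):
    """Inverse estimate: reuse the stored action whose outcome lies closest
    to the sampled goal (random action if nothing has been tried yet)."""
    if not memory:
        return (random.uniform(0, 1), random.uniform(0, 1))
    best = min(memory,
               key=lambda m: (m[1][0] - goal[0]) ** 2 + (m[1][1] - goal[1]) ** 2)
    return best[0]

# Goal babbling loop: sample goals in outcome space, reach for them with the
# current inverse estimate plus a small "delta" exploration, store the result.
for _ in range(200):
    goal = (random.uniform(0, 1), random.uniform(0, 1))
    action = nearest_action(goal)
    action = (action[0] + random.gauss(0, 0.05), action[1] + random.gauss(0, 0.05))
    memory.append((action, forward_world(action)))

print(nearest_action((0.3, 0.8)))  # bimanual action estimated for a 2-D goal
```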


2019, Vol 122 (1), pp. 5-21
Author(s): M. Shoaibur Rahman, Jeffrey M. Yau

Our ability to perceive and discriminate textures is based on the processing of high-frequency vibrations generated on the fingertip as it scans across a surface. Although much is known about the processing of vibration amplitude and frequency information when cutaneous stimulation is experienced at a single location on the body, how these stimulus features are processed when touch occurs at multiple locations is poorly understood. We evaluated participants' ability to discriminate tactile cues (100–300 Hz) on one hand while they ignored distractor cues experienced on their other hand. We manipulated the relative positions of the hands to characterize how limb position influenced cutaneous touch interactions. In separate experiments, participants judged either the frequency or the intensity of mechanical vibrations. We found that vibrations experienced on one hand always systematically modulated the perception of vibrations on the other hand. Notably, bimanual interaction patterns and their sensitivity to hand locations differed according to stimulus feature. Somatosensory interactions in intensity perception were marked only by attenuation that was invariant to hand position manipulations. In contrast, interactions in frequency perception consisted of both bias and sensitivity changes that were more pronounced when the hands were held in close proximity. We implemented models to infer the neural computations that mediate somatosensory interactions in the intensity and frequency dimensions. Our findings reveal obligatory and feature-dependent somatosensory interactions that may be supported by both feature-specific and feature-general operations.
NEW & NOTEWORTHY Little is known about the neural computations mediating feature-specific sensory interactions between the hands. We show that vibrations experienced on one hand systematically modulate the perception of vibrations felt on the other hand. Critically, interaction patterns and their dependence on the relative positions of the hands differed depending on whether participants judged vibration intensity or frequency. These results, which we recapitulate with models, imply that somatosensory interactions are mediated by feature-dependent neural computations.
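A minimal sketch of the feature-dependent interaction pattern described above, assuming a simple gain-plus-bias formulation: intensity interactions are modelled as a fixed attenuation independent of hand position, while frequency interactions combine an attractive bias toward the distractor and added noise, both stronger when the hands are close. All numerical parameters are illustrative, not fitted values from the study.

```python
import random

def perceived_intensity(target_amp: float, distractor_amp: float) -> float:
    """Intensity interaction: constant attenuation of the attended hand's
    vibration when the other hand is also stimulated, regardless of where
    the hands are held (illustrative gain)."""
    attenuation = 0.85 if distractor_amp > 0 else 1.0
    return attenuation * target_amp

def perceived_frequency(target_hz: float, distractor_hz: float,
                        hands_close: bool) -> float:
    """Frequency interaction: the distractor pulls the perceived frequency
    toward itself (bias) and adds noise (reduced sensitivity); both effects
    are stronger when the hands are held close together (illustrative values)."""
    pull = 0.25 if hands_close else 0.10        # bias weight
    noise_sd = 8.0 if hands_close else 4.0      # perceptual noise (Hz)
    biased = target_hz + pull * (distractor_hz - target_hz)
    return biased + random.gauss(0.0, noise_sd)

print(perceived_intensity(1.0, 0.5))
print(perceived_frequency(200.0, 300.0, hands_close=True))
```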


2016, Vol 25 (1), pp. 17-32
Author(s): Merwan Achibet, Adrien Girard, Maud Marchal, Anatole Lécuyer

Haptic feedback is known to improve 3D interaction in virtual environments, but current haptic interfaces remain complex and tailored to desktop interaction. In this paper, we describe an alternative approach called "Elastic-Arm" for incorporating haptic feedback in immersive virtual environments in a simple and cost-effective way. The Elastic-Arm is based on a body-mounted elastic armature that links the user's hand to the body and generates a progressive egocentric force when the arm is extended. A variety of designs can be proposed, with multiple links attached to various locations on the body, in order to simulate different haptic properties and sensations such as different levels of stiffness, weight lifting, and bimanual interaction. Our passive haptic approach can be combined with various 3D interaction techniques, and we illustrate the possibilities offered by the Elastic-Arm through several use cases based on well-known techniques such as the Bubble technique, redirected touching, and pseudo-haptics. A user study showed the effectiveness of our pseudo-haptic technique as well as the general appreciation of the Elastic-Arm. We believe that the Elastic-Arm could be used in various VR applications that call for mobile haptic feedback or human-scale haptic sensations.
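The progressive egocentric force of the Elastic-Arm can be approximated with a simple linear-elastic (Hooke's law) model, as sketched below; the rest length and stiffness values are illustrative assumptions rather than the armature's actual parameters.

```python
def elastic_force(extension_m: float, rest_length_m: float = 0.25,
                  stiffness_n_per_m: float = 60.0) -> float:
    """Restoring force pulling the hand back toward the body once the elastic
    link is stretched past its rest length; zero while the arm stays relaxed.
    Rest length and stiffness are illustrative values, not from the paper."""
    stretch = max(0.0, extension_m - rest_length_m)
    return stiffness_n_per_m * stretch

# The force grows progressively as the user extends the arm further.
for extension in (0.2, 0.3, 0.4, 0.5):
    print(f"extension {extension:.2f} m -> {elastic_force(extension):.1f} N")
```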


2015, Vol 21 (12), pp. 1377-1389
Author(s): Gregory Hough, Ian Williams, Cham Athwal
