Center of Pressure Feedback Modulates the Entrainment of Voluntary Sway to the Motion of a Visual Target

2019 ◽  
Vol 9 (19) ◽  
pp. 3952 ◽  
Author(s):  
Haralampos Sotirakis ◽  
Vassilia Hatzitaki ◽  
Victor Munoz-Martel ◽  
Lida Mademli ◽  
Adamantios Arampatzis

Visually guided weight shifting is widely employed in balance rehabilitation, but the visuo-motor integration process underlying balance improvement remains unclear. In this study, we investigated the role of center of pressure (CoP) feedback in the entrainment of active voluntary sway to a moving visual target, and in sway's dynamic stability, as a function of target predictability. Fifteen young, healthy adult volunteers (height 175 ± 7 cm, body mass 69 ± 12 kg, age 32 ± 5 years) tracked a vertically moving visual target by shifting their body weight antero-posteriorly under two target motion conditions (predictable and less predictable), each performed with and without visual CoP feedback. Results revealed lower coherence, lower gain, and a longer phase lag when tracking the less predictable target motion compared to the predictable one. Feedback did not affect CoP-target coherence, but removing feedback resulted in greater target overshooting and a shorter phase lag when tracking the less predictable target. These adaptations did not affect the dynamic stability of voluntary sway. We conclude that CoP feedback improves spatial perception at the cost of time delays, particularly when tracking a less predictable moving target.
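The coherence, gain, and phase-lag measures used in this abstract are standard cross-spectral quantities. A minimal sketch of how they can be estimated, using synthetic signals (not the study's recordings) and assumed parameters (sampling rate, target frequency, lag):

```python
import numpy as np
from scipy.signal import coherence, csd

# Synthetic example: a 0.25 Hz sinusoidal target and a CoP trace that
# tracks it with attenuated amplitude, a 200 ms lag, and added noise.
fs = 100.0                     # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)
f_target = 0.25                # target frequency (Hz), assumed
target = np.sin(2 * np.pi * f_target * t)
lag = 0.2                      # tracking delay (s), assumed
cop = 0.8 * np.sin(2 * np.pi * f_target * (t - lag))
cop += 0.05 * np.random.default_rng(0).standard_normal(t.size)

# Magnitude-squared coherence: how linearly related CoP and target are.
f, Cxy = coherence(target, cop, fs=fs, nperseg=2048)

# Gain and phase lag from the cross- and auto-spectra at the target frequency.
f2, Pxy = csd(target, cop, fs=fs, nperseg=2048)
_, Pxx = csd(target, target, fs=fs, nperseg=2048)
i = np.argmin(np.abs(f2 - f_target))           # nearest frequency bin
gain = np.abs(Pxy[i]) / Pxx[i].real            # CoP amplitude / target amplitude
phase_lag_s = -np.angle(Pxy[i]) / (2 * np.pi * f2[i])  # lag in seconds
```

At the target frequency, `gain` recovers the attenuation (~0.8) and `phase_lag_s` the tracking delay (~0.2 s); lower coherence, lower gain, and longer lags are the pattern the study reports for less predictable targets.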

PLoS ONE ◽  
2016 ◽  
Vol 11 (3) ◽  
pp. e0151393 ◽  
Author(s):  
Michael W. Kennedy ◽  
Charles R. Crowell ◽  
Michael Villano ◽  
James P. Schmiedeler

Author(s):  
Afrizal Mayub ◽  
Fahmizal Fahmizal

This paper presents sensor-based walking stabilization for bipedal robots using force-sensitive resistor (FSR) sensors. To achieve stable walking on uneven terrain, FSR sensors are used as feedback for evaluating the stability of the bipedal robot: the center of pressure (CoP) generated from four FSR sensors placed on each foot-pad serves as the stability measure, with the CoP position providing an indication of walking stability. The CoP position is then evaluated by a fuzzy logic controller (FLC), which generates appropriate offset angles to restore a stable posture. In addition, an FLC design based on CoP stability regions and stable compliance control are introduced. Finally, the performance of the proposed methods was verified on an 18-degrees-of-freedom (DOF) kid-size bipedal robot.
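The CoP described here is the force-weighted average of the sensor mounting positions. A minimal sketch, with an assumed rectangular foot-pad geometry (not the paper's exact layout):

```python
import numpy as np

def cop_from_fsr(forces, positions):
    """forces: 4 FSR readings (N); positions: 4 (x, y) mounts (m).

    Returns the (x, y) center of pressure, or None if the foot is unloaded.
    """
    forces = np.asarray(forces, dtype=float)
    positions = np.asarray(positions, dtype=float)
    total = forces.sum()
    if total <= 0:                      # foot off the ground
        return None
    return forces @ positions / total   # force-weighted average of mounts

# Example: 10 cm x 16 cm foot-pad with one FSR at each corner,
# loaded more heavily toward the rear (negative y) edge.
corners = [(-0.05, -0.08), (0.05, -0.08), (-0.05, 0.08), (0.05, 0.08)]
cop = cop_from_fsr([10.0, 10.0, 5.0, 5.0], corners)
# cop is centered in x and shifted toward the rear in y
```

A controller like the paper's FLC would then compare this CoP against a stable region of the foot-pad and command corrective offset angles when it drifts toward an edge.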


Perception ◽  
1981 ◽  
Vol 10 (2) ◽  
pp. 191-198 ◽  
Author(s):  
Edward R Strelow ◽  
John A Brabyn

The effects of reducing the range of spatial perception on the accuracy of visually guided locomotion were studied in two experiments. Limiting the range of perception to only near objects produces changes in the flow of stimulus detail and reduces opportunities for the appearance of an aiming point and for motion parallax. Such conditions were found to produce inferior performance compared to full vision, or to minimal background information. A defined aiming point was also found to assist control when no other background was present. The results are discussed with reference to theories of locomotor control and the design of artificial spatial sensing aids for the blind.


2003 ◽  
Vol 22 (3) ◽  
pp. 221-236 ◽  
Author(s):  
Mylène C Dault ◽  
Mirjam de Haart ◽  
Alexander C.H Geurts ◽  
Ilse M.P Arts ◽  
Bart Nienhuis

Author(s):  
Bartholomew Elias

The effects of a dynamic auditory preview display were examined in a visual target aiming task. A moving sound stimulus aligned with a visual target was presented over various distances beyond the bounds of a visual display. Results indicated reduced error magnitudes in aimed responses to visual targets with increasing auditory preview distance. In subsequent testing, the effects of position and velocity misalignments between the sound source and the visual target were assessed. In position misalignment conditions in which the sound source lagged behind the visual target, higher error magnitudes were observed; when the auditory display preceded the visual target, however, performance improved. In velocity mismatch conditions, responses toward fast-moving targets improved when a relatively faster sound source was previewed but were disrupted when a slower sound source was previewed. Conversely, responses toward slow-moving targets improved when a relatively slower sound source was previewed and were disrupted when a faster sound source was previewed.


2004 ◽  
Vol 11 (2-3) ◽  
pp. 371-399 ◽  
Author(s):  
Scott Johnson‐Frey ◽  
Michael McCarty ◽  
Rachel Keen

2004 ◽  
Vol 91 (4) ◽  
pp. 1620-1634 ◽  
Author(s):  
Myrka Zago ◽  
Gianfranco Bosco ◽  
Vincenzo Maffei ◽  
Marco Iosa ◽  
Yuri P. Ivanenko ◽  
...  

Prevailing views on how we time the interception of a moving object assume that the visual inputs are informationally sufficient to estimate the time-to-contact from the object's kinematics. Here we present evidence in favor of a different view: the brain makes the best estimate about target motion based on measured kinematics and an a priori guess about the causes of motion. According to this theory, a predictive model is used to extrapolate time-to-contact from expected dynamics (kinetics). We projected a virtual target moving vertically downward on a wide screen with different randomized laws of motion. In the first series of experiments, subjects were asked to intercept this target by punching a real ball that fell hidden behind the screen and arrived in synchrony with the visual target. Subjects systematically timed their motor responses consistent with the assumption of gravity effects on an object's mass, even when the visual target did not accelerate. With training, the gravity model was not switched off but adapted to nonaccelerating targets by shifting the time of motor activation. In the second series of experiments, there was no real ball falling behind the screen. Instead, the subjects were required to intercept the visual target by clicking a mouse button. In this case, subjects timed their responses consistent with the assumption of uniform motion in the absence of forces, even when the target actually accelerated. Overall, the results are in accord with the theory that motor responses evoked by visual kinematics are modulated by a prior on the target's dynamics. The prior appears surprisingly resistant to modifications based on performance errors.
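The timing difference between the two assumptions can be made concrete with a worked example (the numbers are illustrative, not from the study): for a target seen a distance h above the interception point with downward speed v0, a constant-velocity estimate gives t = h / v0, while a gravity prior solves h = v0*t + 0.5*g*t**2 for t.

```python
import math

g = 9.81   # gravitational acceleration (m/s^2)
h = 2.0    # remaining distance to interception point (m), assumed
v0 = 3.0   # downward speed when first seen (m/s), assumed

# First-order estimate: uniform motion, no forces.
ttc_constant = h / v0                                   # ~0.667 s

# Gravity prior: positive root of 0.5*g*t^2 + v0*t - h = 0.
ttc_gravity = (-v0 + math.sqrt(v0**2 + 2 * g * h)) / g  # ~0.402 s

# The gravity prior predicts contact ~0.26 s earlier, so responses timed
# with it arrive early when the target actually moves at constant speed.
```

This gap is the signature the study exploits: which estimate subjects' responses track reveals which internal model of the target's dynamics they are using.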


2004 ◽  
Vol 92 (3) ◽  
pp. 1867-1879 ◽  
Author(s):  
Scott A. Norris ◽  
Bradley Greger ◽  
Emily N. Hathaway ◽  
W. Thomas Thach

Complex-spike (CS) and simple-spike (SS) discharge from single Purkinje cells (Pc) in the posterolateral cerebellum of two monkeys was recorded during a visually guided reach-touch task. A visual target appeared (TA) off-gaze at a random location on a screen. On initiation of arm reach, the target disappeared, then reappeared (TR) after a fixed delay. TR was either at the same location (baseline condition) or a shifted location at a fixed distance and direction from the TA location (shift condition). Across trials, we observed one or two peaks of CS activity, depending on the reach condition. The first CS peak (T1 CS) was tuned to the location of TA on the screen, following TA by ∼150 ms. The second CS peak (T2 CS) occurred only in the shift condition, was tuned to the shifted location of TR, and followed TR by ∼150 ms. The locational preferences of the T1 and T2 CS peaks were the same. T1 and T2 CSs preceded saccades to TA and TR at the preferred location and occurred during reaches with either arm. T1 CSs also occurred in trials in which the target appeared and a saccade was made to it, but no arm reach followed. SS firing varied with TA/TR at the same preferred location as the accompanying CS. We conclude that posterolateral Pc CS and SS firing changes following the appearance of an off-gaze visual target in a preferred location when a subsequent saccade is made to that location.

