Perceptual decisions about object shape bias visuomotor coordination during rapid interception movements

2019 ◽  
Author(s):  
Deborah A. Barany ◽  
Ana Gómez-Granados ◽  
Margaret Schrayer ◽  
Sarah A. Cutts ◽  
Tarkeshwar Singh

Abstract
Visual processing in parietal areas of the dorsal stream facilitates sensorimotor transformations for rapid movement. This action-related visual processing is hypothesized to play a distinct functional role from the perception-related processing in the ventral stream. However, it is unclear how the two streams interact when perceptual identification is a prerequisite to executing an accurate movement. In the current study, we investigated how perceptual decision-making involving the ventral stream influences arm and eye movement strategies. Participants (N = 26) moved a robotic manipulandum using right whole-arm movements to rapidly reach a stationary object or intercept a moving object on an augmented-reality display. On some blocks of trials, participants needed to identify the shape of the object (circle or ellipse) as a cue to either hit the object (circle) or move to a pre-defined location away from the object (ellipse). We found that during perceptual decision-making, there was an increased urgency to act during interception movements relative to reaching, which was associated with more decision errors. Faster hand reaction times were correlated with a strategy to adjust the movement post-initiation, and this strategy was more prominent during interception. Saccadic reaction times were faster, and initial gaze lags and gains greater, during decisions, suggesting that eye movements adapt to perceptual demands for guiding limb movements. Together, our findings suggest that the integration of ventral stream information with visuomotor planning depends on imposed (or perceived) task demands.

New and Noteworthy
Visual processing for perception and for action is thought to be mediated by two specialized neural pathways. Using a visuomotor decision-making task, we show that participants differentially utilized online perceptual decision-making in reaching and interception, and that eye movements necessary for perception influenced motor decision strategies. These results provide evidence that task complexity modulates how pathways processing perception versus action information interact during the visual control of movement.

2020 ◽  
Vol 123 (6) ◽  
pp. 2235-2248
Author(s):  
Deborah A. Barany ◽  
Ana Gómez-Granados ◽  
Margaret Schrayer ◽  
Sarah A. Cutts ◽  
Tarkeshwar Singh

Visual processing for perception and for action is thought to be mediated by two specialized neural pathways. Using a visuomotor decision-making task, we show that participants differentially utilized online perceptual decision-making in reaching and interception and that eye movements necessary for perception influenced motor decision strategies. These results provide evidence that task complexity modulates how pathways processing perception versus action information interact during the visual control of movement.


2016 ◽  
Vol 115 (2) ◽  
pp. 915-930 ◽  
Author(s):  
Matthew A. Carland ◽  
Encarni Marcos ◽  
David Thura ◽  
Paul Cisek

Perceptual decision making is often modeled as perfect integration of sequential sensory samples until the accumulated total reaches a fixed decision bound. In that view, the buildup of neural activity during perceptual decision making is attributed to temporal integration. However, an alternative explanation is that sensory estimates are computed quickly with a low-pass filter and combined with a growing signal reflecting the urgency to respond, and it is the latter that is primarily responsible for neural activity buildup. These models are difficult to distinguish empirically because they make similar predictions for tasks in which sensory information is constant within a trial, as in most previous studies. Here we presented subjects with a variant of the classic constant-coherence motion discrimination (CMD) task in which we inserted brief motion pulses. We examined the effect of these pulses on reaction times (RTs) in two conditions: 1) when the CMD trials were blocked and subjects responded quickly and 2) when the same CMD trials were interleaved among trials of a variable-motion coherence task that motivated slower decisions. In the blocked condition, early pulses had a strong effect on RTs but late pulses did not, consistent with both models. However, when subjects slowed their decision policy in the interleaved condition, later pulses became effective while early pulses lost their efficacy. This last result contradicts models based on perfect integration of sensory evidence and implies that motion signals are processed with a strong leak, equivalent to a low-pass filter with a short time constant.
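The model contrast above can be illustrated with a minimal, noiseless simulation of the low-pass-filter-plus-urgency account: sensory evidence is leakily filtered with a short time constant and multiplied by a linearly growing urgency signal before being compared against a fixed bound. All parameter values below (coherence, pulse gain, urgency slopes, the 100-ms pulse) are illustrative assumptions, not values from the study. A fast decision policy (steep urgency) crosses the bound early, so only an early pulse can shorten the decision time; a slow policy crosses late, by which point an early pulse has leaked away while a late pulse still helps.

```python
def decision_time(pulse_onset=None, *, urgency_slope=1.0, leak_tau=0.1,
                  coherence=0.5, pulse_gain=0.3, pulse_dur=0.1,
                  bound=1.0, dt=0.001, t_max=5.0):
    """Noiseless sketch of an urgency-gating decision model.

    Evidence is low-pass filtered with time constant `leak_tau` (seconds)
    and multiplied by a linearly growing urgency signal (1 + slope * t).
    Returns the bound-crossing time in seconds (t_max if no crossing).
    """
    x = 0.0  # filtered sensory estimate
    t = 0.0
    while t < t_max:
        s = coherence
        if pulse_onset is not None and pulse_onset <= t < pulse_onset + pulse_dur:
            s += pulse_gain                 # brief motion pulse
        x += (s - x) * dt / leak_tau        # leaky (low-pass) filtering
        if x * (1.0 + urgency_slope * t) >= bound:
            return t                        # urgency-gated bound crossing
        t += dt
    return t_max

if __name__ == "__main__":
    for slope, label in [(6.0, "fast policy (blocked)"),
                         (0.8, "slow policy (interleaved)")]:
        base = decision_time(None, urgency_slope=slope)
        early = decision_time(0.1, urgency_slope=slope)
        late = decision_time(1.0, urgency_slope=slope)
        print(f"{label}: baseline {base:.3f}s, "
              f"early pulse {early:.3f}s, late pulse {late:.3f}s")
```

Under the fast policy the bound is crossed at roughly 0.2 s, so the early pulse shortens the decision time while the 1-s pulse arrives after commitment; under the slow policy the early pulse has decayed (time constant 100 ms) long before the ~1.25-s crossing, whereas the late pulse speeds it up. A perfect integrator (no leak) would instead let early pulses matter at every decision speed.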


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Troy C. Dildine ◽  
Elizabeth A. Necka ◽  
Lauren Y. Atlas

Abstract
Self-report is the gold standard for measuring pain. However, decisions about pain can vary substantially within and between individuals. We measured whether self-reported pain is accompanied by metacognition and variations in confidence, similar to perceptual decision-making in other modalities. Eighty healthy volunteers underwent acute thermal pain and provided pain ratings followed by confidence judgments on continuous visual analogue scales. We investigated whether eye fixations and reaction time during pain rating might serve as implicit markers of confidence. Confidence varied across trials, and increased confidence was associated with faster pain rating reaction times. The association between confidence and fixations varied across individuals as a function of the reliability of individuals’ association between temperature and pain. Taken together, this work indicates that individuals can provide metacognitive judgments of pain and extends research on confidence in perceptual decision-making to pain.


2019 ◽  
Vol 121 (6) ◽  
pp. 1977-1980 ◽  
Author(s):  
Alexander J. Simon ◽  
Jessica N. Schachtner ◽  
Courtney L. Gallen

A large body of work has investigated the effects of attention and expectation on early sensory processing to support decision making. In a recent paper published in The Journal of Neuroscience, Rungratsameetaweemana et al. (Rungratsameetaweemana N, Itthipuripat S, Salazar A, Serences JT. J Neurosci 38: 5632–5648, 2018) found that expectations driven by implicitly learned task regularities do not modulate neural markers of early visual processing. Here, we discuss these findings and propose several lines of follow-up analyses and experiments that could expand on them within the broader perceptual decision-making literature.


2019 ◽  
Author(s):  
Tiasha Saha Roy ◽  
Bapun Giri ◽  
Arpita Saha Chowdhury ◽  
Satyaki Mazumder ◽  
Koel Das

Abstract
Understanding how individuals utilize social information while making perceptual decisions, and how it affects their decision confidence, is crucial in a society. To date, very little is known about perceptual decision making in humans under the influence of social cues, or about the associated neural mediators. The present study provides empirical evidence of how individuals are swayed by social cues while performing a face/car identification task. Subjects were significantly influenced by what they perceived as the decisions of other subjects, even though the cues were in reality manipulated independently of the stimulus. Subjects generally increased their decision confidence when their individual decision and the social cues coincided, whereas their confidence decreased when the cues conflicted with their individual judgments, often leading to a reversal of the decision. Using a novel statistical model, it was possible to rank subjects by their propensity to be influenced by social cues, a ranking subsequently corroborated by analysis of their neural data. Neural time series analysis revealed no significant difference in decision making using social cues in the early stages, unlike neural expectation studies with predictive cues. Multivariate pattern analysis of the neural data points to a potential role of the frontal cortex in the later stages of visual processing, which appeared to code the effect of social cues on perceptual decision making. Specifically, the medial frontal cortex seems to play a role in facilitating perceptual decisions preceded by conflicting cues.


eLife ◽  
2019 ◽  
Vol 8 ◽  
Author(s):  
Jochem van Kempen ◽  
Gerard M Loughnane ◽  
Daniel P Newman ◽  
Simon P Kelly ◽  
Alexander Thiele ◽  
...  

The timing and accuracy of perceptual decision-making are exquisitely sensitive to fluctuations in arousal. Although extensive research has highlighted the role of various neural processing stages in forming decisions, our understanding of how arousal impacts these processes remains limited. Here we isolated electrophysiological signatures of decision-making alongside signals reflecting target selection, attentional engagement, and motor output, and examined their modulation as a function of tonic and phasic arousal, indexed by baseline and task-evoked pupil diameter, respectively. Reaction times were shorter on trials with lower tonic and higher phasic arousal. Additionally, these two pupil measures were predictive of a unique set of EEG signatures that together represent multiple information processing steps of decision-making. Finally, behavioural variability associated with fluctuations in tonic and phasic arousal, indicative of neuromodulators acting on multiple timescales, was mediated by their effects on the EEG markers of attentional engagement, sensory processing, and variability in decision processing.


2021 ◽  
Author(s):  
Matthijs N Oude Lohuis ◽  
Jean L Pie ◽  
Pietro Marchesi ◽  
Jorrit S Montijn ◽  
Christiaan P J de Kock ◽  
...  

The transformation of sensory inputs into behavioral outputs is characterized by an interplay between feedforward and feedback operations in cortical hierarchies. Even in simple sensorimotor transformations, recurrent processing is often expressed in primary cortices in a late phase of the cortical response to sensory stimuli. This late phase is engaged by attention and stimulus complexity, and also encodes sensory-independent factors, including movement and report-related variables. However, despite its pervasiveness, the nature and function of late activity in perceptual decision-making remain unclear. We tested whether the function of late activity depends on the complexity of a sensory change-detection task. Complexity was based on increasing processing requirements for the same sensory stimuli. We found that the temporal window in which V1 is necessary for perceptual decision-making was extended when we increased task complexity, independently of the presented visual stimulus. This window overlapped with the emergence of report-related activity and decreased noise correlations in V1. The onset of these co-occurring activity patterns was time-locked to and preceded reaction time, and predicted the reduction in behavioral performance obtained by optogenetically silencing late V1 activity (>200 ms after stimulus onset), a result confirmed by a second multisensory task with different requirements. Thus, although early visual response components encode all sensory information necessary to solve the task, V1 is not simply relaying information to higher-order areas transforming it into behavioral responses. Rather, task complexity determines the temporal extension of a loop of recurrent activity, which overlaps with report-related activity and determines how perceptual decisions are built.

