Hand position increases visual processing for task-irrelevant flankers.

2014 ◽  
Vol 14 (10) ◽  
pp. 1041-1041
Author(s):  
W. Bush ◽  
S. Vecera


2021 ◽  
pp. 174702182199003
Author(s):  
Andy J Kim ◽  
David S Lee ◽  
Brian A Anderson

Previously reward-associated stimuli have consistently been shown to involuntarily capture attention in the visual domain. Although previously reward-associated but currently task-irrelevant sounds have also been shown to interfere with visual processing, it remains unclear whether such stimuli can interfere with the processing of task-relevant auditory information. To address this question, we modified a dichotic listening task to measure interference from task-irrelevant but previously reward-associated sounds. In a training phase, participants were simultaneously presented with a spoken letter and number in different auditory streams and learned to associate the correct identification of each of three letters with high, low, and no monetary reward, respectively. In a subsequent test phase, participants were again presented with the same auditory stimuli but were instead instructed to report the number while ignoring spoken letters. In both the training and test phases, response time measures demonstrated that attention was biased in favour of the auditory stimulus associated with high value. Our findings demonstrate that attention can be biased towards learned reward cues in the auditory domain, interfering with goal-directed auditory processing.


2012 ◽  
Vol 24 (10) ◽  
pp. 2043-2056 ◽  
Author(s):  
Ayano Matsushima ◽  
Masaki Tanaka

Resistance to distraction is a key component of executive functions and is strongly linked to the prefrontal cortex (PFC). Recent evidence suggests that neural mechanisms exist for selective suppression of task-irrelevant information. However, neuronal signals related to selective suppression have not yet been identified, whereas nonselective surround suppression, which results from attentional enhancement of relevant stimuli, has been well documented. This study examined single-neuron activity in the lateral PFC while monkeys covertly tracked one of several randomly moving objects. Although many neurons responded to the target, we also found a group of neurons that exhibited a selective response to a distractor that was visually identical to the target. Because most neurons were insensitive to an additional distractor that explicitly differed in color from the target, the brain seemed to monitor the distractor only when necessary to maintain internal object segregation. Our results suggest that the lateral PFC might provide at least two top–down signals during covert object tracking: one for enhancement of visual processing for the target and the other for selective suppression of visual processing for the distractor. These signals might work together to discriminate objects, thereby regulating both the sensitivity and specificity of target choice during covert object tracking.


2019 ◽  
Vol 14 (7) ◽  
pp. 727-735 ◽  
Author(s):  
Annett Schirmer ◽  
Maria Wijaya ◽  
Esther Wu ◽  
Trevor B Penney

This pre-registered event-related potential study explored how vocal emotions shape visual perception as a function of attention and listener sex. Visual task displays occurred in silence or with a neutral or an angry voice. Voices were task-irrelevant in a single-task block, but had to be categorized by speaker sex in a dual-task block. In the single task, angry voices increased the occipital N2 component relative to neutral voices in women, but not men. In the dual task, angry voices relative to neutral voices increased occipital N1 and N2 components, as well as accuracy, in women, and marginally decreased accuracy in men. Thus, in women, vocal anger produced a strong, multifaceted visual enhancement comprising attention-dependent and attention-independent processes, whereas in men it produced a small, behavior-focused visual processing impairment that was strictly attention-dependent. In sum, these data indicate that attention and listener sex critically modulate whether and how vocal emotions shape visual perception.


2019 ◽  
Author(s):  
Buse M. Urgen ◽  
Huseyin Boyaci

Expectations and prior knowledge strongly affect and even shape our visual perception. Specifically, valid expectations speed up perceptual decisions and determine what we see in a noisy stimulus. Bayesian models have been remarkably successful in capturing the behavioral effects of expectation. On the other hand, several more mechanistic neural models have also been put forward, which will be referred to as “predictive computation models” here. Both Bayesian and predictive computation models treat perception as a probabilistic inference process that combines prior information and sensory input. Despite the well-established effects of expectation on recognition and decision-making, its effects on low-level visual processing, and the computational mechanisms underlying those effects, remain elusive. Here we investigate how expectations affect early visual processing at the threshold level. Specifically, we measured temporal thresholds (the shortest presentation duration needed to achieve a certain success level) for detecting the spatial location of an intact image, which could be either a house or a face image. Task-irrelevant cues provided prior information, thus forming an expectation, about the category of the upcoming intact image. The validity of the cue was set to 100%, 75%, and 50% in different experimental sessions. In a separate session the cue was neutral and provided no information about the category of the upcoming intact image. Our behavioral results showed that valid expectations do not reduce temporal thresholds; rather, violation of expectation increases the thresholds, specifically when the expectation validity is high. Next, we implemented a recursive Bayesian model, in which the prior is first set using the validity of the specific experimental condition, but in subsequent iterations it is updated using the posterior of the previous iteration. Simulations using the model showed that the observed increase of the temporal thresholds in the unexpected trials is not due to a change in the internal parameters of the system (e.g. decision threshold or internal uncertainty). Rather, further processing is required for a successful detection when the expectation and actual input disagree. These results reveal some surprising behavioral effects of expectation at the threshold level, and show that a simple, parsimonious computational model can successfully predict those effects.
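The recursive updating scheme described in this abstract can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the function name, the fixed evidence strengths (0.8 for the presented category, 0.2 otherwise), and the decision criterion are all assumptions made for the sketch. The prior is initialized from the cue validity, and each iteration's posterior becomes the next iteration's prior until the posterior for the true stimulus reaches criterion; the iteration count stands in for the temporal threshold.

```python
def iterations_to_detect(cue_validity, stimulus="face",
                         criterion=0.95, max_iter=100):
    """Recursive Bayesian sketch over two categories, {face, house}.

    The cue always points to "face"; cue_validity sets the initial prior.
    Returns the number of iterations needed for the posterior of the
    true stimulus to reach the criterion (a proxy for temporal threshold).
    """
    categories = ("face", "house")
    idx = categories.index(stimulus)
    prior = [cue_validity, 1.0 - cue_validity]
    for t in range(1, max_iter + 1):
        # Assumed likelihoods: evidence favours the presented category.
        evidence = [0.8 if c == stimulus else 0.2 for c in categories]
        posterior = [p * e for p, e in zip(prior, evidence)]
        total = sum(posterior)
        posterior = [p / total for p in posterior]
        if posterior[idx] >= criterion:
            return t
        prior = posterior  # recursive step: posterior feeds the next prior
    return max_iter
```

Under these assumed numbers, an expected trial (valid cue) reaches criterion in fewer iterations than an unexpected trial with the same cue validity, and with a 100% valid cue an unexpected stimulus never reaches criterion, qualitatively matching the reported pattern that threshold costs appear specifically when expectation validity is high.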


2012 ◽  
Vol 6 (1) ◽  
Author(s):  
Valerie Higenell ◽  
Brian J. White ◽  
Joshua R. Hwang ◽  
Douglas P. Munoz

The capture of covert spatial attention by salient visual events influences subsequent gaze behavior. A task-irrelevant stimulus (cue) can reduce (attention capture, AC) or prolong (inhibition of return, IOR) saccade reaction time to a subsequent target stimulus, depending on the cue-target delay. Here we investigated the mechanisms that underlie the sensory-based account of AC/IOR by manipulating the visual processing stage at which the cue and target interact. In Experiment 1, liquid crystal shutter goggles were used to test whether AC/IOR occur at a monocular versus binocular processing stage (before versus after signals from both eyes converge). In Experiment 2, we tested whether visual orientation-selective mechanisms are critical for AC/IOR by using oriented “Gabor” stimuli. We found that the magnitude of AC and IOR did not differ between monocular and interocular viewing conditions, or between iso- and ortho-oriented cue-target interactions. The results suggest that the visual mechanisms that contribute to AC/IOR arise at an orientation-independent binocular processing stage.


Author(s):  
Adrian Rivera-Rodriguez ◽  
Maxwell Sherwood ◽  
Ahren B. Fitzroy ◽  
Lisa D. Sanders ◽  
Nilanjana Dasgupta

This study measured event-related brain potentials (ERPs) to test competing hypotheses regarding the effects of anger and race on early visual processing (N1, P2, and N2) and error recognition (ERN and Pe) during a sequentially primed weapon identification task. The first hypothesis was that anger would impair weapon identification in a biased manner by increasing attention and vigilance to, and decreasing recognition and inhibition of weapon identification errors following, task-irrelevant Black (compared to White) faces. Our competing hypothesis was that anger would facilitate weapon identification by directing attention toward task-relevant stimuli (i.e., objects) and away from task-irrelevant stimuli (i.e., race), and increasing recognition and inhibition of biased errors. Results partially supported the second hypothesis, in that anger increased early attention to faces but minimized attentional processing of race, and did not affect error recognition. Specifically, angry (vs. neutral) participants showed increased N1 to both Black and White faces, ablated P2 race effects, and topographically restricted N2 race effects. Additionally, ERN amplitude was unaffected by emotion, race, or object type. However, Pe amplitude was affected by object type (but not emotion or race), such that Pe amplitude was larger after the misidentification of harmless objects as weapons. Finally, anger slowed overall task performance, especially the correct identification of harmless objects, but did not impact task accuracy. Task performance speed and accuracy were unaffected by the race of the face prime. Implications are discussed.


2015 ◽  
Vol 27 (6) ◽  
pp. 1172-1179 ◽  
Author(s):  
Martin Paczynski ◽  
Adam M. Burton ◽  
Amishi P. Jha

Although it is well established that stress can disrupt complex cognitive functions, relatively little is known about how it influences visual processing, especially in terms of visual selective attention. In the current study, we used highly aversive images, taken from the International Affective Picture System, to induce acute, low-intensity stress while participants performed a visual discrimination task. Consistent with prior research, we found that anticipation of aversive stimuli increased overall amplitude of the N170, suggesting an increase in early sensory gain. More importantly, we found that stress disrupted visual selective attention. While in no-stress blocks, the amplitude of the face-sensitive N170 was higher when participants attended to faces rather than scenes in face–scene overlay images; this effect was absent under stress. This was because of an increase in N170 amplitude in the scene-attend condition under stress. We interpret these findings as suggesting that even low-intensity acute stress can impair participants' ability to filter out task-irrelevant information. We discuss our findings in relation to how even brief exposure to low-intensity stress may adversely impact both healthy and clinical populations.


Cephalalgia ◽  
2016 ◽  
Vol 36 (11) ◽  
pp. 1057-1076 ◽  
Author(s):  
Louise O'Hare ◽  
Paul B Hibbard

Background
Migraine is a common neurological condition that often involves differences in visual processing. These sensory processing differences provide important information about the underlying causes of the condition, and for the development of treatments.
Review of psychophysical literature
Psychophysical experiments have shown consistent impairments in contrast sensitivity, orientation acuity, and the perception of global form and motion. They have also established that the addition of task-irrelevant visual noise has a greater effect in migraine, and that surround suppression, masking, and adaptation are all stronger.
Theoretical signal processing model
We propose utilising an established model of visual processing, based on signal processing theory, to account for the behavioural differences seen in migraine. This approach has the advantages of precision and clarity, and generates clear, falsifiable predictions.
Conclusion
Increased effects of noise and differences in excitation and inhibition can account for the differences in migraine visual perception. Consolidating existing research and creating a unified, defined theoretical account is needed to better understand the disorder.

