When expectations are not met: unraveling the computational mechanisms underlying the effect of expectation on perceptual thresholds

2019 ◽  
Author(s):  
Buse M. Urgen ◽  
Huseyin Boyaci

Abstract Expectations and prior knowledge strongly affect, and can even shape, our visual perception. Specifically, valid expectations speed up perceptual decisions and determine what we see in a noisy stimulus. Bayesian models have been remarkably successful in capturing the behavioral effects of expectation. On the other hand, several more mechanistic neural models have also been put forward, which will be referred to as “predictive computation models” here. Both Bayesian and predictive computation models treat perception as a probabilistic inference process that combines prior information and sensory input. Despite the well-established effects of expectation on recognition and decision-making, its effects on low-level visual processing, and the computational mechanisms underlying those effects, remain elusive. Here we investigate how expectations affect early visual processing at the threshold level. Specifically, we measured temporal thresholds (the shortest presentation duration needed to achieve a criterion success level) for detecting the spatial location of an intact image, which could be either a house or a face image. Task-irrelevant cues provided prior information, thus forming an expectation, about the category of the upcoming intact image. The validity of the cue was set to 100, 75, or 50% in different experimental sessions. In a separate session the cue was neutral and provided no information about the category of the upcoming intact image. Our behavioral results showed that valid expectations do not reduce temporal thresholds; rather, violation of expectation increases the thresholds, specifically when expectation validity is high. Next, we implemented a recursive Bayesian model, in which the prior is first set using the validity of the specific experimental condition but, in subsequent iterations, is updated using the posterior of the previous iteration. Simulations using the model showed that the observed increase in temporal thresholds on unexpected trials is not due to a change in the internal parameters of the system (e.g., decision threshold or internal uncertainty). Rather, further processing is required for successful detection when the expectation and the actual input disagree. These results reveal some surprising behavioral effects of expectation at the threshold level, and show that a simple, parsimonious computational model can successfully predict those effects.
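The recursive updating scheme described in this abstract can be sketched as a minimal simulation. This is not the authors' fitted model: the evidence strength, internal noise, and decision threshold below are illustrative assumptions, and the two-category (face/house) decision is reduced to a binary belief that is updated with a Gaussian likelihood ratio on each iteration.

```python
import math
import random

def recursive_bayes_trial(input_matches_expectation, prior,
                          decision_threshold=0.95, evidence_mean=0.2,
                          evidence_sd=1.0, max_iters=500):
    """Return the number of iterations needed to reach a decision.

    `prior` is the initial belief that the upcoming image matches the
    expected (cued) category; it is set from the cue validity of the
    experimental condition (e.g. 0.75 in the 75%-valid sessions).
    """
    p = prior
    for t in range(1, max_iters + 1):
        # Draw one noisy evidence sample; its mean is positive when the
        # actual input matches the expectation and negative otherwise.
        mu = evidence_mean if input_matches_expectation else -evidence_mean
        x = random.gauss(mu, evidence_sd)
        # Gaussian likelihood ratio between the two category hypotheses.
        lr = math.exp(2 * evidence_mean * x / evidence_sd ** 2)
        # Recursive step: the posterior becomes the prior of the next iteration.
        p = p * lr / (p * lr + (1 - p))
        if p >= decision_threshold or p <= 1 - decision_threshold:
            return t
    return max_iters
```

Under these assumed settings, unexpected trials (input that contradicts a high prior such as 0.75) take more iterations on average than expected trials, mirroring the reported rise in temporal thresholds when a high-validity expectation is violated: the decision threshold and the noise level are unchanged, and only the amount of processing needed differs.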

2021 ◽  
pp. 174702182199003
Author(s):  
Andy J Kim ◽  
David S Lee ◽  
Brian A Anderson

Previously reward-associated stimuli have consistently been shown to involuntarily capture attention in the visual domain. Although previously reward-associated but currently task-irrelevant sounds have also been shown to interfere with visual processing, it remains unclear whether such stimuli can interfere with the processing of task-relevant auditory information. To address this question, we modified a dichotic listening task to measure interference from task-irrelevant but previously reward-associated sounds. In a training phase, participants were simultaneously presented with a spoken letter and number in different auditory streams and learned to associate the correct identification of each of three letters with high, low, and no monetary reward, respectively. In a subsequent test phase, participants were again presented with the same auditory stimuli but were instead instructed to report the number while ignoring spoken letters. In both the training and test phases, response time measures demonstrated that attention was biased in favour of the auditory stimulus associated with high value. Our findings demonstrate that attention can be biased towards learned reward cues in the auditory domain, interfering with goal-directed auditory processing.


2020 ◽  
Vol 33 (4-5) ◽  
pp. 521-548
Author(s):  
Laura Cacciamani ◽  
Larisa Sheparovich ◽  
Molly Gibbons ◽  
Brooke Crowley ◽  
Kalynn E. Carpenter ◽  
...  

Abstract We often rely on our sense of vision for understanding the spatial location of objects around us. If vision cannot be used, one must rely on other senses, such as hearing and touch, in order to build spatial representations. Previous work has found evidence of a leftward spatial bias in visual and tactile tasks. In this study, we sought evidence of this leftward bias in a non-visual haptic object location memory task and assessed the influence of a task-irrelevant sound. In Experiment 1, blindfolded right-handed sighted participants used their non-dominant hand to haptically locate an object on the table, then used their dominant hand to place the object back in its original location. During placement, participants either heard nothing (no-sound condition) or a task-irrelevant repeating tone to the left, right, or front of the room. The results showed that participants exhibited a leftward placement bias on no-sound trials. On sound trials, this leftward bias was corrected; placements were faster and more accurate (regardless of the direction of the sound). One explanation for the leftward bias could be that participants were overcompensating their reach with the right hand during placement. Experiment 2 tested this explanation by switching the hands used for exploration and placement, but found similar results to Experiment 1. A third experiment found evidence supporting the explanation that sound corrects the leftward bias by heightening attention. Together, these findings show that sound, even if task-irrelevant and semantically unrelated, can correct one’s tendency to place objects too far to the left.


2012 ◽  
Vol 24 (10) ◽  
pp. 2043-2056 ◽  
Author(s):  
Ayano Matsushima ◽  
Masaki Tanaka

Resistance to distraction is a key component of executive functions and is strongly linked to the prefrontal cortex. Recent evidence suggests that neural mechanisms exist for selective suppression of task-irrelevant information. However, neuronal signals related to selective suppression have not yet been identified, whereas nonselective surround suppression, which results from attentional enhancement of relevant stimuli, has been well documented. This study examined single-neuron activity in the lateral PFC while monkeys covertly tracked one of several randomly moving objects. Although many neurons responded to the target, we also found a group of neurons that exhibited a selective response to a distractor that was visually identical to the target. Because most neurons were insensitive to an additional distractor that explicitly differed in color from the target, the brain seemed to monitor the distractor only when necessary to maintain internal object segregation. Our results suggest that the lateral PFC might provide at least two top–down signals during covert object tracking: one for enhancement of visual processing for the target and the other for selective suppression of visual processing for the distractor. These signals might work together to discriminate objects, thereby regulating both the sensitivity and specificity of target choice during covert object tracking.


2019 ◽  
Vol 14 (7) ◽  
pp. 727-735 ◽  
Author(s):  
Annett Schirmer ◽  
Maria Wijaya ◽  
Esther Wu ◽  
Trevor B Penney

Abstract This pre-registered event-related potential study explored how vocal emotions shape visual perception as a function of attention and listener sex. Visual task displays occurred in silence or with a neutral or an angry voice. Voices were task-irrelevant in a single-task block, but had to be categorized by speaker sex in a dual-task block. In the single task, angry voices increased the occipital N2 component relative to neutral voices in women, but not men. In the dual task, angry voices relative to neutral voices increased occipital N1 and N2 components, as well as accuracy, in women and marginally decreased accuracy in men. Thus, in women, vocal anger produced a strong, multifaceted visual enhancement comprising attention-dependent and attention-independent processes, whereas in men, it produced a small, behavior-focused visual processing impairment that was strictly attention-dependent. In sum, these data indicate that attention and listener sex critically modulate whether and how vocal emotions shape visual perception.


Author(s):  
Jeremiah Oluwatosin Bandele ◽  
Moses Oluwafemi Onibonoje ◽  
Abisayo O. Aladeloba

In free-space optical (FSO) communication systems limited by atmospheric turbulence, using a non-adaptive decision threshold to determine the transmitted bits results in bit error rate (BER) floors at high BER values in all turbulence regimes. Practically implementing an adaptive decision threshold that properly tracks the fluctuations due to atmospheric turbulence is challenging; devising ways of optimising the non-adaptive decision threshold used by FSO designers is therefore necessary. In this paper, gain-saturated pre-amplified FSO communication systems using non-adaptive decision thresholds are investigated in the presence of atmospheric turbulence, pointing errors (PEs), geometric spread (GS), and amplified spontaneous emission noise, by applying analytical methods and Monte Carlo (MC) simulation techniques. System performance is evaluated for various turbulence regimes, normalised beam widths, normalised PE standard deviations, and small-signal gains, using fixed-gain and gain-saturated optical amplifiers (OAs). The results show that in the presence of atmospheric turbulence, PEs, and GS, optimal BER performance is obtained with OA input powers higher than the internal saturation power of the OA. Furthermore, by using high-gain OAs and varying the decision threshold level, acceptable BER performance can be obtained with a non-adaptive decision threshold even in strong turbulence regimes.


2009 ◽  
Vol 5 (2) ◽  
pp. 270-273 ◽  
Author(s):  
Szonya Durant ◽  
Johannes M Zanker

Illusory position shifts induced by motion suggest that motion processing can interfere with perceived position. This may be because accurate position representation is lost during successive visual processing steps. We found that complex motion patterns, which can only be extracted at a global level by pooling and segmenting local motion signals and integrating over time, can influence perceived position. We used motion-defined Gabor patterns containing motion-defined boundaries, which themselves moved over time. This ‘motion-defined motion’ induced position biases of up to 0.5°, much larger than has been found with luminance-defined motion. The size of the shift correlated with how detectable the motion-defined motion direction was, suggesting that the amount of bias increased with the magnitude of this complex directional signal. However, positional shifts did occur even when participants were not aware of the direction of the motion-defined motion. The size of the perceptual position shift was greatly reduced when the position judgement was made relative to the location of a static luminance-defined square, but not eliminated. These results suggest that motion-induced position shifts are a result of general mechanisms matching dynamic object properties with spatial location.


2014 ◽  
Vol 51 (6) ◽  
pp. 529-538 ◽  
Author(s):  
Verena C. Seibold ◽  
Bettina Rolke

2020 ◽  
Author(s):  
Hannah J Stewart ◽  
Dawei Shen ◽  
Nasim Sham ◽  
Claude Alain

Abstract Selective attention to sound object features such as pitch and location is associated with enhanced brain activity in the ventral and dorsal streams, respectively. We examined the role of these pathways in involuntary orienting and conflict resolution using functional magnetic resonance imaging (fMRI). Participants were presented with two tones that might or might not share the same non-spatial (frequency) or spatial (location) auditory features. In separate blocks of trials, participants were asked to attend to sound frequency or sound location and ignore any change in the task-irrelevant feature. In both attend-frequency and attend-location tasks, response times were slower when the task-irrelevant feature changed than when it stayed the same (involuntary orienting). This behavioural cost coincided with enhanced activity in the prefrontal cortex and superior temporal gyrus (STG). Conflict resolution was examined by comparing situations where the change in stimulus features was congruent (both features changed) and incongruent (only one feature changed). Participants were slower and less accurate for incongruent than congruent sound features. This congruency effect was associated with enhanced activity in the prefrontal cortex, and was greater in the right STG and medial frontal cortex during the attend-location task than during the attend-frequency task. Together, these findings do not support a strict division of ‘labour’ into ventral and dorsal streams, but rather suggest interactions between these pathways in situations involving changes in a task-irrelevant sound feature and conflict resolution. These findings also validate the Test of Attention in Listening task by revealing distinct neural correlates for involuntary orienting and conflict resolution.

