The Similarity between Target and Nontarget Affects Different Processing Stages Depending on Stimulus Feature Dimensions: An ERP Study

Author(s):
Yumiko Fujii
Hiromi Morita
Yuji Takeda

2017
Vol 284 (1867)
pp. 20172035
Author(s):
Jason Samaha
Bradley R. Postle

Adaptive behaviour depends on the ability to introspect accurately about one's own performance. Whether this metacognitive ability is supported by the same mechanisms across different tasks is unclear. We investigated the relationship between metacognition of visual perception and metacognition of visual short-term memory (VSTM). Experiments 1 and 2 required subjects to estimate the perceived or remembered orientation of a grating stimulus and rate their confidence. Individual differences in metacognitive accuracy were strongly positively correlated across the two tasks. This relationship was not accounted for by individual differences in task performance or average confidence, and was present across two different metrics of metacognition and in both experiments. A model-based analysis of data from a third experiment showed that a cross-domain correlation only emerged when both tasks shared the same task-relevant stimulus feature. That is, metacognition for perception and VSTM were correlated when both tasks required orientation judgements, but not when the perceptual task was switched to require contrast judgements. In contrast with previous results comparing perception and long-term memory, which have largely provided evidence for domain-specific metacognitive processes, the current findings suggest that metacognition of visual perception and VSTM is supported by a domain-general metacognitive architecture, but only when both domains share the same task-relevant stimulus feature.


2021
Vol 11 (1)
Author(s):
Helen Feigin
Shira Baror
Moshe Bar
Adam Zaidel

Perceptual decisions are biased by recent perceptual history, a phenomenon termed 'serial dependence'. Here, we investigated which aspects of perceptual decisions lead to serial dependence, and disambiguated the influences of low-level sensory information, prior choices and motor actions. Participants discriminated whether a brief visual stimulus lay to the left or right of the screen center. Following a series of biased 'prior' location discriminations, subsequent 'test' location discriminations were biased toward the prior choices, even when these were reported via different motor actions (using different keys), and when the prior and test stimuli differed in color. By contrast, prior discriminations about an irrelevant stimulus feature (color) did not substantially influence subsequent location discriminations, even though these were reported via the same motor actions. Additionally, when color (not location) was discriminated, a bias in prior stimulus locations no longer influenced subsequent location discriminations. Although low-level stimuli and motor actions did not trigger serial dependence on their own, similarity of these features across discriminations boosted the effect. These findings suggest that relevance across perceptual decisions is a key factor for serial dependence. Accordingly, serial dependence likely reflects a high-level mechanism by which the brain predicts and interprets new incoming sensory information in accordance with relevant prior choices.


2018
Author(s):
Md. Shoaibur Rahman
Jeffrey M. Yau

Bimanual touch may require combining what is felt on the hands with where the hands are located in space. The computations supporting bimanual touch are poorly understood. We found that tactile cue combination patterns and their sensitivity to the locations of the hands differed according to the attended stimulus feature. These idiosyncratic perceptual patterns can be explained by distinct cue combination models that each involve divisive normalization, a canonical computation.
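The "canonical computation" named in this abstract, divisive normalization, has a standard textbook form: each unit's response is its (exponentiated) input drive divided by the pooled drives of the population plus a semi-saturation constant. The sketch below is a minimal illustration of that standard form only; the exponent, constant, and drive values are illustrative assumptions, not parameters from the paper's cue combination models.

```python
import numpy as np

def divisive_normalization(drives, sigma=1.0, n=2.0):
    """Divide each unit's exponentiated drive by the summed exponentiated
    drives of the pool plus a semi-saturation constant (same exponent).
    sigma and n are illustrative, not values fitted in the paper."""
    d = np.asarray(drives, dtype=float) ** n
    return d / (sigma ** n + d.sum())

# Equal drives yield equal normalized responses:
print(divisive_normalization([1.0, 1.0]))  # [1/3, 1/3]
# Adding a strong drive suppresses the response to the weak one:
print(divisive_normalization([1.0, 4.0]))  # [1/18, 16/18]
```

The suppressive pooling in the denominator is what lets normalization produce the kind of context-dependent combination the abstract describes: the same cue evokes a different response depending on what else is driving the population.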


2021
pp. 095679762110242
Author(s):
Chang-Yuan Lee
Carey K. Morewedge

We introduce a theoretical framework distinguishing between anchoring effects, anchoring bias, and judgmental noise: anchoring effects require anchoring bias, but noise modulates their size. We tested this framework by manipulating stimulus magnitudes. As magnitudes increase, psychophysical noise due to scalar variability widens the perceived range of plausible values for the stimulus. This increased noise, in turn, increases the influence of anchoring bias on judgments. In 11 preregistered experiments (N = 3,552 adults), anchoring effects increased with stimulus magnitude for point estimates of familiar and novel stimuli (e.g., reservation prices for hotels and donuts, counts in dot arrays). Comparisons of relevant and irrelevant anchors showed that noise itself did not produce anchoring effects; noise amplified anchoring bias. Our findings identify a stimulus feature that predicts the size and replicability of anchoring effects: stimulus magnitude. More broadly, we show how psychophysical noise can be used to test relationships between bias and noise in judgment under uncertainty.
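The scalar-variability mechanism invoked here, noise whose standard deviation grows in proportion to stimulus magnitude (a constant Weber fraction), can be sketched numerically. The Weber fraction and magnitudes below are illustrative assumptions, not values estimated from these experiments.

```python
import numpy as np

rng = np.random.default_rng(0)
WEBER_FRACTION = 0.15  # illustrative value, not taken from the paper

def noisy_estimates(true_magnitude, n_samples=10_000):
    """Scalar variability: internal magnitude estimates have a standard
    deviation proportional to the true magnitude."""
    sd = WEBER_FRACTION * true_magnitude
    return rng.normal(true_magnitude, sd, size=n_samples)

small = noisy_estimates(10.0)     # e.g. a low-magnitude stimulus
large = noisy_estimates(1000.0)   # e.g. a high-magnitude stimulus
# The spread of plausible values is ~100x wider for the larger magnitude,
# leaving more room for an anchor to pull the final estimate.
print(small.std(), large.std())
```

On the paper's account, it is this widened range of plausible values at large magnitudes, not the noise itself, that gives anchoring bias more room to operate.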


2016
Author(s):
Stephane Deny
Ulisse Ferrari
Emilie Mace
Pierre Yger
Romain Caplette
...

In the early visual system, cells of the same type perform the same computation in different places of the visual field. How these cells jointly encode a complex visual scene is unclear. A common assumption is that cells of the same type extract a single stimulus feature to form a feature map, but this has rarely been observed directly. Using large-scale recordings in the rat retina, we show that a homogeneous population of fast OFF ganglion cells simultaneously encodes two radically different features of a visual scene. Cells close to a moving object code linearly for its position, while distant cells remain largely invariant to the object's position and, instead, respond non-linearly to changes in the object's speed. Cells switch between these two computations depending on the stimulus. We developed a quantitative model that accounts for this effect and identified a likely disinhibitory circuit that mediates it. Ganglion cells of a single type thus code not for one but for two features simultaneously. This richer, flexible neural map might also be present in other sensory systems.


2020
Author(s):
Long Luu
Alan A. Stocker

Categorical judgments can systematically bias the perceptual interpretation of stimulus features. However, it has remained unclear whether categorical judgments directly modify working memory representations or, alternatively, generate these biases via an inference process downstream from working memory. To address this question, we ran two novel psychophysical experiments in which human subjects had to revert their categorical judgments about a stimulus feature, if incorrect based on feedback, before providing an estimate of the feature. If categorical judgments indeed directly altered sensory representations in working memory, subjects' estimates should reflect some aspects of their initial (incorrect) categorical judgment in those trials. We found no traces of the initial categorical judgment. Rather, subjects seem able to flexibly switch their categorical judgment if needed and use the correct corresponding categorical prior to properly perform feature inference. A cross-validated model comparison also revealed that feedback may lead to selective memory recall, such that only memory samples that are consistent with the categorical judgment are accepted for the inference process. Our results suggest that categorical judgments do not modify sensory information in working memory but rather act as a top-down expectation in the subsequent sensory recall and inference process downstream from working memory.



2018
Vol 44 (5)
pp. 767-777
Author(s):
Renning Hao
Mark W. Becker
Chaoxiong Ye
Qiang Liu
Taosheng Liu

1998
Vol 10 (5)
pp. 605-614
Author(s):
Walter Ritter
Hilary Gomes
Nelson Cowan
Elyse Sussman
Herbert G. Vaughan

Research with the mismatch negativity (MMN) component of event-related potentials has uncovered a system that detects change in the acoustic environment on an automatic basis. The system is considered to compare incoming stimuli to representations of the past and to emit an MMN if change is detected. Previous investigations have shown that the relevant memory of the past can become dormant and then be reactivated by a reminder stimulus. It is unclear, however, whether what is reactivated is a holistic representation of stimuli or separate representations of stimulus features. The present study provides data that support the latter possibility but leave open the former one.

