Fast decoding of natural object categories from intracranial field potentials in monkey's visual cortex

2010 ◽  
Vol 10 (7) ◽  
pp. 947-947
Author(s):  
M. Cauchoix ◽  
T. Serre ◽  
G. Kreiman ◽  
D. Fize
Neuron ◽  
2009 ◽  
Vol 62 (2) ◽  
pp. 281-290 ◽  
Author(s):  
Hesheng Liu ◽  
Yigal Agam ◽  
Joseph R. Madsen ◽  
Gabriel Kreiman

2015 ◽  
Vol 27 (11) ◽  
pp. 2117-2125 ◽  
Author(s):  
Reshanne R. Reeder ◽  
Francesca Perini ◽  
Marius V. Peelen

Theories of visual selective attention propose that top–down preparatory attention signals mediate the selection of task-relevant information in cluttered scenes. Neuroimaging and electrophysiology studies have provided correlative evidence for this hypothesis, finding increased activity in target-selective neural populations in visual cortex in the period between a search cue and target onset. In this study, we used online TMS to test whether preparatory neural activity in visual cortex is causally involved in naturalistic object detection. In two experiments, participants detected the presence of object categories (cars, people) in a diverse set of photographs of real-world scenes. TMS was applied over a region in posterior temporal cortex identified by fMRI as carrying category-specific preparatory activity patterns. Results showed that TMS applied over posterior temporal cortex before scene onset (−200 and −100 msec) impaired the detection of object categories in subsequently presented scenes, relative to vertex and early visual cortex stimulation. This effect was specific to category-level detection and was related to the type of attentional template participants adopted, with the strongest effects observed in participants adopting category-level templates. These results provide evidence for a causal role of preparatory attention in mediating the detection of objects in cluttered daily-life environments.


2017 ◽  
Vol 117 (1) ◽  
pp. 388-402 ◽  
Author(s):  
Michael A. Cohen ◽  
George A. Alvarez ◽  
Ken Nakayama ◽  
Talia Konkle

Visual search is a ubiquitous visual behavior, and efficient search is essential for survival. Different cognitive models have explained the speed and accuracy of search based either on the dynamics of attention or on similarity of item representations. Here, we examined the extent to which performance on a visual search task can be predicted from the stable representational architecture of the visual system, independent of attentional dynamics. Participants performed a visual search task with 28 conditions reflecting different pairs of categories (e.g., searching for a face among cars, body among hammers, etc.). The time it took participants to find the target item varied as a function of category combination. In a separate group of participants, we measured the neural responses to these object categories when items were presented in isolation. Using representational similarity analysis, we then examined whether the similarity of neural responses across different subdivisions of the visual system had the requisite structure needed to predict visual search performance. Overall, we found strong brain/behavior correlations across most of the higher-level visual system, including both the ventral and dorsal pathways when considering both macroscale sectors as well as smaller mesoscale regions. These results suggest that visual search for real-world object categories is well predicted by the stable, task-independent architecture of the visual system.
NEW & NOTEWORTHY Here, we ask which brain regions have neural response patterns that correlate with behavioral performance in a visual processing task. We found that the representational structure across all of high-level visual cortex has the requisite structure to predict behavior. Furthermore, when directly comparing different neural regions, we found that they all had highly similar category-level representational structures. These results point to a ubiquitous and uniform representational structure in high-level visual cortex underlying visual object processing.
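The representational similarity analysis described above can be sketched in a few lines: build one pairwise-dissimilarity matrix from neural responses and one from search times, then correlate their unique (upper-triangle) entries. The data below are synthetic, and the 8-category setup (which yields the 28 pairwise conditions mentioned in the abstract) is purely illustrative, not the authors' pipeline.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical data: pairwise dissimilarities between 8 object categories.
# neural_rdm[i, j] stands in for 1 - correlation of the neural response
# patterns to categories i and j; search_rt[i, j] stands in for the mean
# time to find a category-i target among category-j distractors.
n_categories = 8
neural_rdm = rng.random((n_categories, n_categories))
neural_rdm = (neural_rdm + neural_rdm.T) / 2          # symmetrize
np.fill_diagonal(neural_rdm, 0.0)                     # zero self-dissimilarity
search_rt = neural_rdm + 0.3 * rng.random((n_categories, n_categories))
search_rt = (search_rt + search_rt.T) / 2

# Representational similarity analysis: compare the two matrices using only
# the off-diagonal upper-triangle entries (each category pair counted once,
# giving 8 * 7 / 2 = 28 pairs).
iu = np.triu_indices(n_categories, k=1)
rho, p = spearmanr(neural_rdm[iu], search_rt[iu])
print(f"brain/behavior correlation: rho = {rho:.2f}")
```

A rank correlation (Spearman) is the common choice here because it assumes only a monotonic, not linear, relation between neural dissimilarity and search time.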


2019 ◽  
Vol 31 (10) ◽  
pp. 1563-1572 ◽  
Author(s):  
Clayton Hickey ◽  
Daniele Pollicino ◽  
Giacomo Bertazzoli ◽  
Ludwig Barbaro

People are quicker to detect examples of real-world object categories in natural scenes than is predicted by classic attention theories. One explanation for this puzzle suggests that experience renders the visual system sensitive to midlevel features diagnosing target presence. These are detected without the need for spatial attention, much as occurs for targets defined by low-level features like color or orientation. The alternative is that naturalistic search relies on spatial attention but is highly efficient because global scene information can be used to quickly reject nontarget objects and locations. Here, we use ERPs to differentiate between these possibilities. Results show that hallmark evidence of ultrafast target detection in frontal brain activity is preceded by an index of spatially specific distractor suppression in visual cortex. Naturalistic search for heterogeneous targets therefore appears to rely on spatial operations that act on neural object representations, as predicted by classic attention theory. People appear able to rapidly reject nontarget objects and locations, consistent with the idea that global scene information is used to constrain naturalistic search and increase search efficiency.


2019 ◽  
Vol 122 (4) ◽  
pp. 1634-1648 ◽  
Author(s):  
Benjamin Fischer ◽  
Andreas Schander ◽  
Andreas K. Kreiter ◽  
Walter Lang ◽  
Detlef Wegener

Recordings of epidural field potentials (EFPs) allow neuronal activity to be acquired over a large region of cortical tissue with minimal invasiveness. Because electrodes are placed on top of the dura and do not enter the neuronal tissue, EFPs offer intriguing options for both clinical and basic science research. On the other hand, EFPs represent the integrated activity of larger neuronal populations and possess a higher trial-by-trial variability and a reduced signal-to-noise ratio due to the additional barrier of the dura. It is thus unclear whether and to what extent EFPs have sufficient spatial selectivity to allow for conclusions about the underlying functional cortical architecture, and whether single EFP trials provide enough information on the short timescales relevant for many clinical and basic neuroscience purposes. We used the high spatial resolution of primary visual cortex to address these issues and investigated the extent to which very short EFP traces allow reliable decoding of spatial information. We briefly presented different visual objects at one of nine closely adjacent locations and recorded neuronal activity with a high-density epidural multielectrode array in three macaque monkeys. With the use of receiver operating characteristics (ROC) to identify the most informative data, machine-learning algorithms provided close-to-perfect classification rates for all 27 stimulus conditions. A binary classifier applying a simple max function on ROC-selected data further showed that single trials might be classified with 100% performance even without advanced offline classifiers. Thus, although highly variable, EFPs constitute an extremely valuable source of information and offer new perspectives for minimally invasive recording of large-scale networks.
NEW & NOTEWORTHY Epidural field potential (EFP) recordings provide a minimally invasive approach to investigate large-scale neural networks, but little is known about whether they possess the required specificity for basic and clinical neuroscience. By making use of the spatial selectivity of primary visual cortex, we show that single-trial information can be decoded with close-to-perfect performance, even without advanced classifiers and from very little data. This establishes EFPs as a highly attractive and widely usable signal.
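The core idea of the abstract above — rank channels by ROC discriminability, then classify single trials with a simple threshold rule rather than a trained classifier — can be illustrated on synthetic data. The channel count, effect size, and mean-based threshold below are assumptions made for this sketch, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical single-trial EFP features: trials x channels, for two stimulus
# locations. Channel 3 is made informative; the rest carry only noise.
n_trials, n_channels = 200, 10
labels = rng.integers(0, 2, n_trials)             # 0 / 1 = two stimulus locations
data = rng.normal(size=(n_trials, n_channels))
data[labels == 1, 3] += 2.0                       # inject a response at channel 3

def auc(pos, neg):
    """Area under the ROC curve via the rank-sum identity: P(pos > neg)."""
    return (pos[:, None] > neg[None, :]).mean()

# ROC analysis: score each channel by how well it separates the two
# conditions (0.5 = chance, 0 or 1 = perfect separation).
aucs = np.array([auc(data[labels == 1, c], data[labels == 0, c])
                 for c in range(n_channels)])
best = int(np.argmax(np.abs(aucs - 0.5)))

# Simple threshold rule on the most informative channel -- no trained
# classifier, in the spirit of the max-function decoder described above.
threshold = data[:, best].mean()
predictions = (data[:, best] > threshold).astype(int)
accuracy = (predictions == labels).mean()
print(f"best channel: {best}, AUC = {aucs[best]:.2f}, accuracy = {accuracy:.2f}")
```

The rank-sum identity used in `auc` (AUC equals the probability that a randomly chosen positive trial outscores a randomly chosen negative one) avoids explicitly sweeping thresholds, which keeps the channel-ranking step compact.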


Neuron ◽  
2009 ◽  
Vol 61 (1) ◽  
pp. 35-41 ◽  
Author(s):  
Steffen Katzner ◽  
Ian Nauhaus ◽  
Andrea Benucci ◽  
Vincent Bonin ◽  
Dario L. Ringach ◽  
...  

2010 ◽  
Vol 9 (8) ◽  
pp. 740-740
Author(s):  
F. A. Khawaja ◽  
J. M. G. Tsui ◽  
C. C. Pack
