Seeing versus Knowing: The Temporal Dynamics of Real and Implied Colour Processing in the Human Brain

2018 ◽  
Author(s):  
Lina Teichmann ◽  
Tijl Grootswagers ◽  
Thomas Carlson ◽  
Anina N. Rich

Abstract
Colour is a defining feature of many objects, playing a crucial role in our ability to rapidly recognise things in the world around us and make categorical distinctions. For example, colour is a useful cue when distinguishing lemons from limes or blackberries from raspberries. That means our representation of many objects includes key colour-related information. The question addressed here is whether the neural representation activated by knowing that something is red is the same as that activated when we actually see something red, particularly in regard to timing. We addressed this question using neural timeseries (magnetoencephalography, MEG) data to contrast real colour perception and implied object colour activation. We applied multivariate pattern analysis (MVPA) to analyse the brain activation patterns evoked by colour accessed via real colour perception and implied colour activation. Applying MVPA to MEG data allows us here to focus on the temporal dynamics of these processes. Male and female human participants (N=18) viewed isoluminant red and green shapes and grey-scale, luminance-matched pictures of fruits and vegetables that are red (e.g., tomato) or green (e.g., kiwifruit) in nature. We show that the brain activation pattern evoked by real colour perception is similar to implied colour activation, but that this pattern is instantiated at a later time. These results suggest that a common colour representation can be triggered by activating object representations from memory and perceiving colours.
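The time-resolved MVPA approach described in this abstract can be sketched in Python. This is a generic illustration on simulated data, not the authors' pipeline: the array shapes, the injected-signal timepoint, and the choice of sklearn's LinearDiscriminantAnalysis as classifier are all assumptions for the sake of the example.

```python
# Sketch of time-resolved MVPA decoding (illustrative; simulated data).
# A classifier is trained at every timepoint of a trials x channels x time
# array, giving a cross-validated decoding-accuracy time course.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated MEG-like data: 80 trials, 20 channels, 50 timepoints.
# A class-dependent signal is injected from timepoint 20 onward.
n_trials, n_channels, n_times = 80, 20, 50
y = np.repeat([0, 1], n_trials // 2)          # e.g. red vs green labels
X = rng.normal(size=(n_trials, n_channels, n_times))
X[y == 1, :5, 20:] += 1.0                     # signal in 5 channels after t=20

def decode_timecourse(X, y, cv=5):
    """Cross-validated decoding accuracy at each timepoint."""
    return np.array([
        cross_val_score(LinearDiscriminantAnalysis(), X[:, :, t], y, cv=cv).mean()
        for t in range(X.shape[2])
    ])

acc = decode_timecourse(X, y)
print(f"pre-signal accuracy ~{acc[:20].mean():.2f}, "
      f"post-signal accuracy ~{acc[25:].mean():.2f}")
```

The onset latency of above-chance decoding in such a time course is what allows real and implied colour conditions to be compared in time, as the abstract describes.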

Brain ◽  
1996 ◽  
Vol 119 (4) ◽  
pp. 1263-1276 ◽  
Author(s):  
R. Vandenberghe ◽  
P. Dupont ◽  
B. De Bruyn ◽  
G. Bormans ◽  
J. Michiels ◽  
...  

2014 ◽  
Vol 27 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Hanani Abdul Manan ◽  
Elizabeth A. Franz ◽  
Ahmad Nazlim Yusoff ◽  
Siti Zamratol-Mai Sarah Mukari

2017 ◽  
Vol 12 (5) ◽  
pp. 812 ◽  
Author(s):  
SungHo Jang ◽  
Youn-Hee Choi ◽  
WooHyuk Jang ◽  
Sang-Uk Im ◽  
Keun-Bae Song ◽  
...  

Author(s):  
Hamid Karimi-Rouzbahani ◽  
Mozhgan Shahmohammadi ◽  
Ehsan Vahab ◽  
Saeed Setayeshi ◽  
Thomas Carlson

Abstract
Humans are remarkably efficient at recognizing objects. Understanding how the brain performs object recognition has been challenging. Our understanding has been advanced substantially in recent years with the development of multivariate decoding methods. Most state-of-the-art decoding procedures make use of the ‘mean’ neural activation to extract object category information, which overlooks temporal variability in the signals. Here, we studied category-related information in 30 mathematically distinct features from electroencephalography (EEG) across three independent and highly varied datasets using multivariate decoding. While the event-related potential (ERP) components N1 and P2a were among the most informative features, the informative original signal samples and Wavelet coefficients, selected through principal component analysis, outperformed them. These four informative features showed more pronounced decoding in the Theta frequency band, which has been suggested to support feed-forward processing of visual information in the brain. Correlational analyses showed that the features which were most informative about object categories could predict participants’ behavioral performance (reaction time) more accurately than the less informative features. These results suggest a new approach for studying how the human brain encodes object category information and how we can read it out more optimally to investigate the temporal dynamics of the neural code. The code is available online at https://osf.io/wbvpn/.
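The central contrast in this abstract, decoding from the 'mean' activation versus from features that retain temporal structure, can be sketched as follows. This is an illustrative toy, not the paper's code: the simulated signal (a zero-mean transient that differs between classes), the single-channel setup, and the logistic-regression classifier are all assumptions.

```python
# Sketch of the feature-comparison idea (illustrative; simulated data):
# decode category from the within-trial 'mean' versus the raw samples.
# The two classes here differ only in temporal shape, so the mean
# feature is blind to the difference while the raw samples are not.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Simulated single-channel EEG: 100 trials x 64 samples.
n_trials, n_samples = 100, 64
y = np.repeat([0, 1], n_trials // 2)
X = rng.normal(size=(n_trials, n_samples))
peak = np.exp(-0.5 * ((np.arange(n_samples) - 20) / 3.0) ** 2)
X[y == 1] += 1.5 * (peak - peak.mean())   # zero-mean transient for class 1

mean_feature = X.mean(axis=1, keepdims=True)   # the 'mean' feature
raw_feature = X                                # full time samples

clf = LogisticRegression(max_iter=1000)
acc_mean = cross_val_score(clf, mean_feature, y, cv=5).mean()
acc_raw = cross_val_score(clf, raw_feature, y, cv=5).mean()
print(f"mean feature: {acc_mean:.2f}, raw samples: {acc_raw:.2f}")
```

Because the transient is constructed to have zero mean, the mean feature decodes at chance while the raw samples decode well, which is the kind of temporal information the abstract argues mean-based procedures overlook.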


2017 ◽  
Author(s):  
Tijl Grootswagers ◽  
Briana L. Kennedy ◽  
Steven B. Most ◽  
Thomas A. Carlson

Abstract
How is emotion represented in the brain: is it categorical, or is it organised along dimensions? In the present study, we applied multivariate pattern analysis (MVPA) to magnetoencephalography (MEG) data to study the brain’s temporally unfolding representations of different emotion constructs. First, participants rated 525 images on the dimensions of valence and arousal and by intensity of discrete emotion categories (happiness, sadness, fear, disgust, and anger). Thirteen new participants then viewed subsets of these images within an MEG scanner. We used Representational Similarity Analysis (RSA) to compare behavioral ratings to the unfolding neural representation of the stimuli in the brain. Ratings of valence and arousal explained significant proportions of the MEG data, even after corrections for low-level image properties. Additionally, behavioral ratings of the discrete emotions fear, disgust, and happiness significantly predicted early neural representations, whereas rating models of anger and sadness did not. Different emotion constructs also showed unique temporal signatures. Fear and disgust, both highly arousing and negative, were rapidly discriminated by the brain, but disgust was represented for an extended period of time relative to fear. Overall, our findings suggest that 1) dimensions of valence and arousal are quickly represented by the brain, as are some discrete emotions, and 2) different emotion constructs exhibit unique temporal dynamics. We discuss implications of these findings for theoretical understanding of emotion and for the interplay of discrete and dimensional aspects of emotional experience.
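The RSA method named in this abstract can be sketched as follows. This is a generic illustration of the technique on simulated data, not the study's pipeline: the number of stimuli, the valence-difference model RDM, the Euclidean neural distance, and the "early vs late" contrast are all assumptions made for the example.

```python
# Sketch of Representational Similarity Analysis (illustrative; simulated
# data): build a neural representational dissimilarity matrix (RDM) from
# stimulus-by-channel patterns and correlate it with a behavioural model
# RDM, e.g. pairwise differences in valence ratings.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(2)

# 12 stimuli with hypothetical valence ratings; the model RDM is the
# pairwise absolute difference in ratings.
valence = rng.uniform(-1, 1, size=12)
model_rdm = pdist(valence[:, None], metric="cityblock")

# Simulated stimulus-by-channel neural patterns at two "timepoints":
# an early one unrelated to valence and a later one encoding it.
w = rng.normal(size=30)                       # shared encoding direction
early = rng.normal(size=(12, 30))
late = rng.normal(size=(12, 30)) + 4.0 * np.outer(valence, w)

def rsa_corr(patterns, model_rdm):
    """Spearman correlation between a neural RDM and a model RDM."""
    neural_rdm = pdist(patterns)              # Euclidean distances
    return spearmanr(neural_rdm, model_rdm)[0]

r_early = rsa_corr(early, model_rdm)
r_late = rsa_corr(late, model_rdm)
print(f"early: {r_early:.2f}, late: {r_late:.2f}")
```

Repeating this correlation at every timepoint yields the temporally unfolding model fits that the study uses to compare valence, arousal, and discrete-emotion models.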


2019 ◽  
Vol 121 (5) ◽  
pp. 1588-1590 ◽  
Author(s):  
Luca Casartelli

Neural, oscillatory, and computational counterparts of multisensory processing remain a crucial challenge for neuroscientists. Converging evidence underlines a certain efficiency in balancing stability and flexibility of sensory sampling, supporting the general idea that multiple parallel and hierarchically organized processing stages in the brain contribute to our understanding of the (sensory/perceptual) world. Intriguingly, how temporal dynamics impact and modulate multisensory processes in our brain can be investigated by drawing on studies of perceptual illusions.


2015 ◽  
Vol 370 (1668) ◽  
pp. 20140170 ◽  
Author(s):  
Riitta Hari ◽  
Lauri Parkkonen

We discuss the importance of timing in brain function: how the temporal dynamics of the world have left their traces in the brain during evolution, and how we can monitor the dynamics of the human brain with non-invasive measurements. Accurate timing is important for the interplay of neurons, neuronal circuitries, brain areas and human individuals. In the human brain, multiple temporal integration windows are hierarchically organized, with temporal scales ranging from microseconds to tens and hundreds of milliseconds for perceptual, motor and cognitive functions, and up to minutes, hours and even months for hormonal and mood changes. Accurate timing is impaired in several brain diseases. From the current repertoire of non-invasive brain imaging methods, only magnetoencephalography (MEG) and scalp electroencephalography (EEG) provide millisecond time-resolution; our focus in this paper is on MEG. Since the introduction of high-density whole-scalp MEG/EEG coverage in the 1990s, the instrumentation has not changed drastically; yet, novel data analyses are advancing the field rapidly by shifting the focus from the mere pinpointing of activity hotspots to seeking stimulus- or task-specific information and to characterizing functional networks. During the next decades, we can expect increased spatial resolution and accuracy of time-resolved brain imaging and a better understanding of brain function, especially its temporal constraints, with the development of novel instrumentation and finer-grained, physiologically inspired generative models of local and network activity. Merging both spatial and temporal information with increasing accuracy, and carrying out recordings in naturalistic conditions, including social interaction, will bring much new information about human brain function.

