Selective attention gates the interactive crossmodal coupling between perceptual systems

2017 ◽  
Author(s):  
Silvia Convento ◽  
Md. Shoaibur Rahman ◽  
Jeffrey M. Yau

Summary Cortical sensory systems often activate in parallel, even when stimulation is experienced through a single sensory modality [1–3]. Critically, the functional relationship between co-activated cortical systems is unclear: co-activations may reflect interactive coupling between information-linked cortical systems or merely parallel but independent sensory processing. Here, we report causal evidence consistent with the hypothesis that human somatosensory cortex (S1), which co-activates with auditory cortex during the processing of vibrations and textures [4–9], interactively couples to cortical systems that support auditory perception. In a series of behavioural experiments, we used transcranial magnetic stimulation (TMS) to probe interactions between the somatosensory and auditory perceptual systems as we manipulated attention state. Acute manipulation of S1 activity using TMS impairs auditory frequency perception when subjects simultaneously attend to auditory and tactile frequency, but not when attention is directed to audition alone. Auditory frequency perception is unaffected by TMS over visual cortex, thus confirming the privileged interactions between the somatosensory and auditory systems in temporal frequency processing [10–13]. Our results provide a key demonstration that selective attention can modulate the functional properties of cortical systems thought to support specific sensory modalities. The gating of crossmodal coupling by selective attention may critically support multisensory interactions and feature-specific perception.

Perception ◽  
10.1068/p2984 ◽  
2000 ◽  
Vol 29 (6) ◽  
pp. 745-754 ◽  
Author(s):  
Gail Martino ◽  
Lawrence E Marks

At each moment, we experience a melange of information arriving at several senses, and often we focus on inputs from one modality and ‘reject’ inputs from another. Does input from a rejected sensory modality modulate one's ability to make decisions about information from a selected one? When the modalities are vision and hearing, the answer is “yes”, suggesting that vision and hearing interact. In the present study, we asked whether similar interactions characterize vision and touch. As with vision and hearing, results obtained in a selective attention task show cross-modal interactions between vision and touch that depend on the synesthetic relationship between the stimulus combinations. These results imply that similar mechanisms may govern cross-modal interactions across sensory modalities.


2018 ◽  
Author(s):  
Lexi E. Crommett ◽  
Deeksha Madala ◽  
Jeffrey M. Yau

Naturally occurring signals in audition and touch can be complex and marked by temporal variations in frequency and amplitude. Auditory frequency sweep processing has been studied extensively; however, much less is known about sweep processing in touch since studies have primarily focused on the perception of simple sinusoidal vibrations. Given the extensive interactions between audition and touch in the frequency processing of pure tone signals, we reasoned that these senses might also interact in the processing of higher-order frequency representations like sweeps. In a series of psychophysical experiments, we characterized the influence of auditory distractors on the ability of participants to discriminate tactile frequency sweeps. Auditory frequency sweeps systematically biased the tactile perception of sweep direction. Importantly, auditory cues exerted little influence on tactile sweep direction perception when the sounds and vibrations occupied different absolute frequency ranges or when the sounds consisted of intensity sweeps. Thus, audition and touch interact in frequency sweep perception in a frequency- and feature-specific manner. Our results demonstrate that audio-tactile interactions are not constrained to the processing of simple sinusoids. Because higher-order frequency representations may be synthesized from simpler representations, our findings imply that multisensory interactions in the temporal frequency domain span multiple hierarchical levels in sensory processing.


2019 ◽  
Vol 32 (1) ◽  
pp. 67-85 ◽  
Author(s):  
Silvia Convento ◽  
Kira A. Wegner-Clemens ◽  
Jeffrey M. Yau

Abstract In both audition and touch, sensory cues comprising repeating events are perceived either as a continuous signal or as a stream of temporally discrete events (flutter), depending on the events’ repetition rate. At high repetition rates (>100 Hz), auditory and tactile cues interact reciprocally in pitch processing. The frequency of a cue experienced in one modality systematically biases the perceived frequency of a cue experienced in the other modality. Here, we tested whether audition and touch also interact in the processing of low-frequency stimulation. We also tested whether multisensory interactions occurred if the stimulation in one modality comprised click trains and the stimulation in the other modality comprised amplitude-modulated signals. We found that auditory cues bias touch and tactile cues bias audition on a flutter discrimination task. Even though participants were instructed to attend to a single sensory modality and ignore the other cue, the flutter rate in the attended modality is perceived to be similar to that of the distractor modality. Moreover, we observed similar interaction patterns regardless of stimulus type and whether the same stimulus types were experienced by both senses. Combined with earlier studies, our results suggest that the nervous system extracts and combines temporal rate information from multisensory environmental signals, regardless of stimulus type, in both the low and the high temporal frequency domains. This function likely reflects the importance of temporal frequency as a fundamental feature of our multisensory experience.


2012 ◽  
Vol 25 (0) ◽  
pp. 6 ◽
Author(s):  
Christian Keitel ◽  
Erich Schröger ◽  
Matthias M. Müller

Attention has been conceptualized as a filter mechanism that copes with the fundamental capacity limitation of sensory processing by selecting the input of most behavioral relevance. Recent research has demonstrated that information can be selected by attention based on the sensory modality in which it is presented. However, it is not yet known whether this ‘intermodal’ selection relies on common attentional resources; alternatively, each sensory modality might draw on an independent pool of resources. The present work investigates the neural mechanisms of sustained intermodal attention by pitting the notions of common vs. modality-specific attentional resources against each other. To this end, concurrently presented, frequency-tagged auditory and visual stimuli elicited continuous electrophysiological brain responses in the respective early sensory cortices. Three experiments probed (1) whether attention to a particular sensory modality results in a modality-specific modulation of processing, (2) whether this modulation reflects facilitation of the attended sensory modality, inhibition of the unattended sensory modality, or a combination of both, and (3) whether stimuli of different sensory modalities enter a competition for processing, which would require intermodal attention to rely on common attentional resources. Attentional modulation of stimulus processing was found to be modality-specific. This modulation likely involved two separate mechanisms: a facilitation of stimuli presented to attended sensory modalities and an inhibition of stimuli presented to unattended modalities. As a complementary result, stimuli were found to enter a competition for processing within but not between modalities. In conclusion, the present findings provide evidence that early sensory processing relies on modality-specific rather than common attentional resources.


Author(s):  
Shani Haskal de la Zerda ◽  
Shai Netser ◽  
Hen Magalnik ◽  
Mayan Briller ◽  
Dan Marzan ◽  
...  

Abstract In humans, discrimination between individuals, also termed social recognition, can rely on a single sensory modality, such as vision. By analogy, social recognition in rodents is thought to be based upon olfaction. Here, we hypothesized that social recognition in rodents relies upon integration of olfactory, auditory and somatosensory cues, hence requiring active behavior of social stimuli. Using distinct social recognition tests, we demonstrated that adult male rats and mice do not recognize familiar stimuli or learn the identity of novel stimuli that are inactive due to anesthesia. We further revealed that impairing the olfactory, somatosensory or auditory systems prevents recognition of familiar stimuli. Finally, we found that familiar and novel stimuli generate distinct movement patterns during social discrimination and that subjects react differentially to the movement of these stimuli. Thus, unlike what occurs in humans, social recognition in rats and mice relies on integration of information from several sensory modalities.


Author(s):  
Bruno and

Multisensory interactions in perception are pervasive and fundamental, as we have documented throughout this book. In this final chapter, we propose that contemporary work on multisensory processing is a paradigm shift in perception science, calling for a radical reconsideration of empirical and theoretical questions within an entirely new perspective. In making our case, we emphasize that multisensory perception is the norm, not the exception, and we remark that multisensory interactions can occur early in sensory processing. We reiterate the key notions that multisensory interactions come in different kinds and that principles of multisensory processing must be considered when tackling multisensory daily-life problems. We discuss the role of unisensory processing in a multisensory world, and we conclude by suggesting future directions for the multisensory field.


2021 ◽  
pp. 214-220
Author(s):  
Wei Lin Toh ◽  
Neil Thomas ◽  
Susan L. Rossell

There has been burgeoning interest in studying hallucinations in psychosis occurring across multiple sensory modalities. The current study aimed to characterize the auditory hallucination and delusion profiles of patients with auditory hallucinations only versus those with multisensory hallucinations. Participants with psychosis were partitioned into groups with voices only (AVH; n = 50) versus voices plus hallucinations in at least one other sensory modality (AVH+; n = 50), based on their responses on the Scale for the Assessment of Positive Symptoms (SAPS). Basic demographic and clinical information was collected, and the Questionnaire for Psychotic Experiences (QPE) was used to assess psychosis phenomenology. Relative to the AVH group, the AVH+ group showed significantly elevated compliance with perceived commands, auditory illusions, and sensed presences. The AVH+ group also had greater levels of delusion-related distress and functional impairment and was more likely to endorse delusions of reference and misidentification. This preliminary study uncovered important phenomenological differences in those with multisensory hallucinations. Future hallucination research extending beyond the auditory modality is needed.


Perception ◽  
1989 ◽  
Vol 18 (6) ◽  
pp. 739-751 ◽  
Author(s):  
Christian Marendaz

Interindividual differences in field dependence–independence (FDI), which emerge in situations of vision–posture conflict when subjects are required to orient their bodies vertically, were investigated. The first aim was to see whether the same interindividual differences are found in judgements of the orientation of forms in focal vision, in which subjects have to deal with conflicting spatial references processed by different sensory modalities. The second aim was to test the idea that the FDI dimension is due to functional habits linked to balancing. Subjects performed Kopfermann's (1930) shape-orientation task in either a stable (experiment 1) or an unstable (experiment 2) postural condition. Results showed that the FDI dimension comes into play in the solution of the Kopfermann shape-orientation task, and that there is an interactive link between FDI and postural balance, consistent with theoretical expectations. More generally, it appears that the ‘choice’ of a spatial reference system is the product of both individual and situational characteristics, and that the ‘vicariance’ (or interchangeability) of the sensory systems dealing with gravitational upright is at the basis of this interaction.


2016 ◽  
Vol 14 (3) ◽  
pp. 21-31 ◽  
Author(s):  
O.B. Bogdashina

Synaesthesia is a perceptual phenomenon in which stimulation of one sensory modality triggers a perception in one or more other sensory modalities. Synaesthesia is not uniform and can manifest itself in different ways. Because the sensations and their interpretation vary over time, the phenomenon is hard to study. The article presents a classification of the different forms of synaesthesia, including sensory and cognitive, and bimodal and multimodal synaesthesia. Some synaesthetes have several forms and variants of synaesthesia, while others have just one. Although synaesthesia is not specific to autism spectrum disorders, it is quite common among autistic individuals. The article deals with the most common forms of synaesthesia in autism and the advantages and problems of synaesthetic perception in children with autism spectrum disorders, and provides advice to parents on how to recognise synaesthesia in children with autism.


Author(s):  
Drew McRacken ◽  
Maddie Dyson ◽  
Kevin Hu

Over the past few decades, a significant number of reports have suggested that reaction times differ across sensory modalities – e.g., that visual reaction time is slower than tactile reaction time. A recent report by Holden and colleagues stated (1) that there has been a significant historical upward drift in reaction times reported in the literature, (2) that this drift or degradation in reaction times could be accounted for by inaccuracies in the methods used, and (3) that these inaccurate methods led to inaccurate reporting of differences between visual- and tactile-based reaction time testing. The Holden study utilized robotics (i.e., no human factors) to test visual and tactile reaction time methods but did not assess how individuals would perform on different sensory modalities. This study utilized three different sensory modalities – visual, auditory, and tactile – to test reaction time. By changing the way in which subjects were prompted and measuring subsequent reaction time, the impact of sensory modality could be analyzed. Reaction time testing for two sensory modalities, auditory and visual, was administered through an Arduino Uno microcontroller device, while tactile-based reaction time testing was administered with the Brain Gauge. A range of stimulus intensities was delivered for each sensory modality. Average reaction time and reaction time variability were assessed, and a trend could be identified in the reaction time measurements for each sensory modality. Switching the sensory modality did not result in a difference in reaction time, and it was concluded that this was due to the accurate circuitry used to deliver each test. Increasing stimulus intensity for each sensory modality resulted in faster reaction times.
The results of this study confirm the findings of Holden and colleagues and contradict the results reported in countless studies concluding (1) that reaction times are slower now than they were 50 years ago and (2) that there are differences in reaction times across sensory modalities (vision, hearing, touch). The implication is that the use of accurate reaction time methods could have a significant impact on clinical outcomes, and that many methods in current clinical use perpetuate poor measurement while wasting the time and money of countless subjects or patients.

