Situational Determinants of Hand-Proximity Effects

2019 · Vol 5 (1) · Author(s): Tony Thomas, Meera Mary Sunny

Recent studies have demonstrated altered visual processing of stimuli in the region proximal to the hand. It has been challenging to characterize the range and nature of these processing differences. In an attempt to deconstruct the factors giving rise to Hand-Proximity Effects (HPEs), we manipulated the organization of items in a visual search display. In two experiments, we observed no HPE. Specifically, in Experiment 1, we presented the search display in only one half of the monitor (split diagonally), which could be either near or far from the hand placed on the corner of the monitor. A Bayesian analysis showed that search efficiency did not differ significantly in either the 'near' or the 'far' condition relative to the baseline condition, in which the hand rested on the lap. In Experiment 2, the search display was arranged horizontally across the monitor. A Bayesian analysis showed that RTs did not vary with the proximity of the target to the hand, nor did they differ from the baseline (lap) condition. The present results characterize features of the HPE that have not been reported previously and are in line with recent reports of failures to replicate the HPE under various circumstances.
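The abstract does not specify which Bayesian test was used. As a rough, hypothetical sketch of how evidence *for* the null (no RT difference between a hand-proximity condition and the lap baseline) can be quantified, the BIC approximation to the Bayes factor is one common default; the function name and the simulated data below are illustrative, not from the study.

```python
import numpy as np

def bic_bayes_factor_01(x, y):
    """Approximate Bayes factor BF01 (evidence for the null) for a
    two-sample mean difference, via the BIC approximation
    BF01 ~= exp((BIC_alt - BIC_null) / 2)."""
    data = np.concatenate([x, y])
    n = data.size
    # Null model: one common mean for both conditions.
    rss0 = np.sum((data - data.mean()) ** 2)
    # Alternative model: a separate mean per condition.
    rss1 = np.sum((x - x.mean()) ** 2) + np.sum((y - y.mean()) ** 2)
    bic0 = n * np.log(rss0 / n) + 1 * np.log(n)  # 1 mean parameter
    bic1 = n * np.log(rss1 / n) + 2 * np.log(n)  # 2 mean parameters
    return np.exp((bic1 - bic0) / 2.0)

rng = np.random.default_rng(0)
near = rng.normal(650, 80, 40)  # simulated search RTs (ms), hand near display
lap = rng.normal(650, 80, 40)   # simulated baseline RTs, hand on lap
print(bic_bayes_factor_01(near, lap))  # BF01 > 1 favours "no HPE"
```

When the two condition means coincide exactly, the approximation reduces to BF01 = sqrt(n), i.e., the null is favoured and more strongly so with more data.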

2019 · Vol 31 (7) · pp. 1079-1090 · Author(s): Peter S. Whitehead, Mathilde M. Ooi, Tobias Egner, Marty G. Woldorff

The contents of working memory (WM) guide visual attention toward matching features, with visual search being faster when the target and a feature of an item held in WM spatially overlap (validly cued) than when they occur at different locations (invalidly cued). Recent behavioral studies have indicated that attentional capture by WM content can be modulated by cognitive control: When WM cues are reliably helpful to visual search (predictably valid), capture is enhanced, but when reliably detrimental (predictably invalid), capture is attenuated. The neural mechanisms underlying this effect are not well understood, however. Here, we leveraged the high temporal resolution of ERPs time-locked to the onset of the search display to determine how and at what processing stage cognitive control modulates the search process. We manipulated predictability by grouping trials into unpredictable (50% valid/invalid) and predictable (100% valid, 100% invalid) blocks. Behavioral results confirmed that predictability modulated WM-related capture. Comparison of ERPs to the search arrays showed that the N2pc, a posteriorly distributed signature of initial attentional orienting toward a lateralized target, was not impacted by target validity predictability. However, a longer latency, more anterior, lateralized effect—here, termed the “contralateral attention-related negativity”—was reduced under predictable conditions. This reduction interacted with validity, with substantially greater reduction for invalid than valid trials. These data suggest cognitive control over attentional capture by WM content does not affect the initial attentional-orienting process but can reduce the need to marshal later control mechanisms for processing relevant items in the visual world.
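The N2pc and the "contralateral attention-related negativity" described above are both contralateral-minus-ipsilateral difference waves. A minimal sketch of how such a lateralized component is typically quantified from epoched EEG (the function name, electrode pairing, and time window are illustrative assumptions, not the paper's pipeline):

```python
import numpy as np

def lateralized_amplitude(left_chan, right_chan, target_side, times,
                          window=(0.2, 0.3)):
    """Mean contralateral-minus-ipsilateral amplitude in a time window.

    left_chan, right_chan: (n_trials, n_times) voltages at a mirror-symmetric
    posterior electrode pair (e.g., PO7/PO8 for the N2pc).
    target_side: array of 'L'/'R' giving the target's hemifield per trial.
    """
    mask = (times >= window[0]) & (times < window[1])
    # For left-hemifield targets the right channel is contralateral.
    contra = np.where(target_side[:, None] == "L", right_chan, left_chan)
    ipsi = np.where(target_side[:, None] == "L", left_chan, right_chan)
    diff = contra - ipsi              # contra-minus-ipsi difference wave
    return diff[:, mask].mean()       # mean amplitude in the window (uV)
```

A more anterior electrode pair and a later window would target the longer-latency component the authors describe, with the same contra-minus-ipsi logic.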


2007 · Author(s): Elizabeth A. Krupinski, Hans Roehrig, Jiahua Fan

2008 · Vol 19 (2) · pp. 128-136 · Author(s): Geoffrey F. Woodman, Min-Suk Kang, Kirk Thompson, Jeffrey D. Schall

2017 · Vol 117 (1) · pp. 388-402 · Author(s): Michael A. Cohen, George A. Alvarez, Ken Nakayama, Talia Konkle

Visual search is a ubiquitous visual behavior, and efficient search is essential for survival. Different cognitive models have explained the speed and accuracy of search based either on the dynamics of attention or on similarity of item representations. Here, we examined the extent to which performance on a visual search task can be predicted from the stable representational architecture of the visual system, independent of attentional dynamics. Participants performed a visual search task with 28 conditions reflecting different pairs of categories (e.g., searching for a face among cars, body among hammers, etc.). The time it took participants to find the target item varied as a function of category combination. In a separate group of participants, we measured the neural responses to these object categories when items were presented in isolation. Using representational similarity analysis, we then examined whether the similarity of neural responses across different subdivisions of the visual system had the requisite structure needed to predict visual search performance. Overall, we found strong brain/behavior correlations across most of the higher-level visual system, including both the ventral and dorsal pathways when considering both macroscale sectors as well as smaller mesoscale regions. These results suggest that visual search for real-world object categories is well predicted by the stable, task-independent architecture of the visual system. NEW & NOTEWORTHY Here, we ask which neural regions have neural response patterns that correlate with behavioral performance in a visual processing task. We found that the representational structure across all of high-level visual cortex has the requisite structure to predict behavior. Furthermore, when directly comparing different neural regions, we found that they all had highly similar category-level representational structures. These results point to a ubiquitous and uniform representational structure in high-level visual cortex underlying visual object processing.
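The core of the representational similarity analysis described above is a rank correlation between pairwise neural dissimilarities and pairwise behavioral search times. A minimal sketch under assumed data structures (the function name, the 1 - Pearson-r dissimilarity, and the toy inputs are illustrative, not the authors' exact pipeline):

```python
import numpy as np
from scipy.stats import spearmanr

def rsa_brain_behavior(neural_patterns, behavior_rt):
    """Correlate neural dissimilarity with behavioral search times.

    neural_patterns: dict mapping category -> response vector (n_voxels,)
    behavior_rt: dict mapping (cat_a, cat_b) -> mean search RT for that pair
    Returns the Spearman correlation between neural dissimilarity (1 - r)
    and search RT across category pairs.
    """
    neural, behav = [], []
    for (a, b), rt in behavior_rt.items():
        r = np.corrcoef(neural_patterns[a], neural_patterns[b])[0, 1]
        neural.append(1.0 - r)   # dissimilar categories -> values near 2
        behav.append(rt)
    rho, _ = spearmanr(neural, behav)
    return rho
```

Running this per region of interest, with its own voxel patterns, yields the region-by-region brain/behavior correlations the abstract summarizes.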


SLEEP · 2018 · Vol 41 (suppl_1) · pp. A252-A252 · Author(s): E Giora, A Galbiati, M Zucconi, L Ferini-Strambi

2013 · Vol 13 (9) · pp. 689-689 · Author(s): N. Siva, A. Chaparro, D. Nguyen, E. Palmer

2008 · Vol 14 (6) · pp. 990-1003 · Author(s): Brandon Keehn, Laurie Brenner, Erica Palmer, Alan J. Lincoln, Ralph-Axel Müller

Although previous studies have shown that individuals with autism spectrum disorder (ASD) excel at visual search, the underlying neural mechanisms remain unknown. This study investigated the neurofunctional correlates of visual search in children with ASD and matched typically developing (TD) children, using an event-related functional magnetic resonance imaging design. We used a visual search paradigm, manipulating search difficulty by varying set size (6, 12, or 24 items), distractor composition (heterogeneous or homogeneous), and target presence to identify brain regions associated with efficient and inefficient search. Although the ASD group did not show faster response times (RTs) than the TD group, they did demonstrate increased search efficiency, as measured by the slope of RT across set sizes. Activation patterns also differed between the ASD group, which recruited a network including frontal, parietal, and occipital cortices, and the TD group, which showed less extensive activation mostly limited to occipito-temporal regions. Direct comparisons (for both homogeneous and heterogeneous search conditions) revealed greater activation in occipital and frontoparietal regions in ASD than in TD participants. These results suggest that search efficiency in ASD may be related to enhanced discrimination (reflected in occipital activation) and increased top-down modulation of visual attention (associated with frontoparietal activation). (JINS, 2008, 14, 990–1003.)
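The efficiency measure used above, the slope of RT over set size, is conventionally a linear fit in ms/item, with shallower slopes meaning more efficient search. A minimal sketch with simulated group means (the numbers below are illustrative, not the study's data):

```python
import numpy as np

def search_efficiency(set_sizes, mean_rts):
    """Search efficiency as the slope of mean RT over set size (ms/item).
    Shallower slopes indicate more efficient search."""
    slope, _intercept = np.polyfit(set_sizes, mean_rts, 1)
    return slope

# Simulated mean RTs (ms) at the study's set sizes of 6, 12, and 24 items.
sizes = np.array([6, 12, 24])
td_rts = np.array([820, 1000, 1360])   # hypothetical TD group: 30 ms/item
asd_rts = np.array([810, 900, 1080])   # hypothetical ASD group: 15 ms/item
print(search_efficiency(sizes, td_rts), search_efficiency(sizes, asd_rts))
```

On this toy data the ASD slope is half the TD slope even though the overall RTs are similar, mirroring the dissociation the abstract reports between raw speed and efficiency.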

