Predictive processing account of action perception: Evidence from effective connectivity in the action observation network

Cortex ◽  
2020 ◽  
Vol 128 ◽  
pp. 132-142 ◽  
Author(s):  
Burcu A. Urgen ◽  
Ayse P. Saygin

Abstract
Visual perception of actions is supported by a network of brain regions in the occipito-temporal, parietal, and premotor cortex in the primate brain, known as the Action Observation Network (AON). Although a growing body of research characterizes the functional properties of each node of this network, the communication and direction of information flow between the nodes remain unclear. According to the predictive coding account of action perception, this network is not a purely feedforward system but has feedback connections through which prediction error signals are communicated between the regions of the AON. In the present study, we investigated the effective connectivity of the AON in an experimental setting where the human subjects' predictions about the observed agent were violated, using fMRI and Dynamic Causal Modeling (DCM). We specifically examined the influence of the lowest and highest nodes in the AON hierarchy, pSTS and ventral premotor cortex, respectively, on the middle node, the inferior parietal cortex, during prediction violation. Our DCM results suggest that the influence on the inferior parietal node is exerted through a feedback connection from ventral premotor cortex during perception of actions that violate people's predictions.


2020 ◽  
Vol 117 (23) ◽  
pp. 13151-13161
Author(s):  
Lucia Amoruso ◽  
Alessandra Finisguerra ◽  
Cosimo Urgesi

Understanding object-directed actions performed by others is central to everyday life. This ability is thought to rely on the interaction between the dorsal action observation network (AON) and a ventral object recognition pathway. On this view, the AON would encode action kinematics, and the ventral pathway, the most likely intention afforded by the objects. However, experimental evidence supporting this model is still scarce. Here, we aimed to disentangle the contribution of dorsal vs. ventral pathways to action comprehension by exploiting their differential tuning to low-spatial frequencies (LSFs) and high-spatial frequencies (HSFs). We filtered naturalistic action images to contain only LSF or HSF and measured behavioral performance and corticospinal excitability (CSE) using transcranial magnetic stimulation (TMS). Actions were embedded in congruent or incongruent scenarios as defined by the compatibility between grips and intentions afforded by the contextual objects. Behaviorally, participants were better at discriminating congruent actions in intact than LSF images. This effect was reversed for incongruent actions, with better performance for LSF than intact and HSF. These modulations were mirrored at the neurophysiological level, with greater CSE facilitation for congruent than incongruent actions for HSF and the opposite pattern for LSF images. Finally, only for LSF did we observe CSE modulations according to grip kinematics. While results point to differential dorsal (LSF) and ventral (HSF) contributions to action comprehension for grip and context encoding, respectively, the negative congruency effect for LSF images suggests that object processing may influence action perception not only through ventral-to-dorsal connections, but also through a dorsal-to-dorsal route involved in predictive processing.


Author(s):  
Gloria Pizzamiglio ◽  
Zuo Zhang ◽  
James Kolasinski ◽  
Jane M. Riddoch ◽  
Richard E. Passingham ◽  
...  

2013 ◽  
Vol 35 (1) ◽  
pp. 22-28 ◽  
Author(s):  
Miyuki Tamura ◽  
Yoshiya Moriguchi ◽  
Shigekazu Higuchi ◽  
Akiko Hida ◽  
Minori Enomoto ◽  
...  

2011 ◽  
Vol 7 (1) ◽  
pp. 64-80 ◽  
Author(s):  
Daniel J. Shaw ◽  
Marie-Helene Grosbras ◽  
Gabriel Leonard ◽  
G. Bruce Pike ◽  
Tomáš Paus

2011 ◽  
Vol 22 (3) ◽  
pp. 668-679 ◽  
Author(s):  
Luca Turella ◽  
Federico Tubaldi ◽  
Michael Erb ◽  
Wolfgang Grodd ◽  
Umberto Castiello

Author(s):  
Davide Albertini ◽  
Marco Lanzilotto ◽  
Monica Maranesi ◽  
Luca Bonini

The neural processing of others' observed actions recruits a large network of brain regions (the action observation network, AON), in which frontal motor areas are thought to play a crucial role. Since the discovery of mirror neurons (MNs) in the ventral premotor cortex, it has been assumed that their activation was conditional upon the presentation of biological rather than nonbiological motion stimuli, supporting a form of direct visuomotor matching. Nonetheless, nonbiological observed movements have rarely been used as control stimuli to evaluate visual specificity, thereby leaving unresolved the issue of similarity among the neural codes for executed actions and for biological or nonbiological observed movements. Here, we addressed this issue by recording from two nodes of the AON that are attracting increasing interest, namely the ventro-rostral part of the dorsal premotor area F2 and the mesial pre-supplementary motor area F6 of macaques while they 1) executed a reaching-grasping task, 2) observed an experimenter performing the task, and 3) observed a nonbiological effector moving in the same context. Our findings revealed stronger neuronal responses to the observation of biological than nonbiological movement, but biological and nonbiological visual stimuli produced highly similar neural dynamics and relied on largely shared neural codes, which in turn differed remarkably from those associated with executed actions. These results indicate that, in highly familiar contexts, visuomotor remapping processes in premotor areas hosting MNs are more complex and flexible than predicted by a direct visuomotor matching hypothesis.


2009 ◽  
Vol 20 (2) ◽  
pp. 486-491 ◽  
Author(s):  
A. A. Sokolov ◽  
A. Gharabaghi ◽  
M. S. Tatagiba ◽  
M. Pavlova

PLoS ONE ◽  
2015 ◽  
Vol 10 (8) ◽  
pp. e0137020 ◽  
Author(s):  
Kaat Alaerts ◽  
Franca Geerlings ◽  
Lynn Herremans ◽  
Stephan P. Swinnen ◽  
Judith Verhoeven ◽  
...  
