Directed information flow—A model free measure to analyze causal interactions in event related EEG-MEG-experiments

2008 ◽  
Vol 29 (2) ◽  
pp. 193-206 ◽  
Author(s):  
Hermann Hinrichs ◽  
Toemme Noesselt ◽  
Hans-Jochen Heinze
IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 176634-176645
Author(s):  
Wei Gao ◽  
Yan Shi ◽  
Shanzhi Chen

2013 ◽  
Vol 12 (04) ◽  
pp. 1350019 ◽  
Author(s):  
XUEJIAO WANG ◽  
PENGJIAN SHANG ◽  
JINGJING HUANG ◽  
GUOCHEN FENG

Recently, an information-theoretic concept, transfer entropy, was introduced by Schreiber. It quantifies, in a nonparametric and explicitly nonsymmetric way, the flow of information between two time series. This model-free approach, based on Shannon entropy, can in principle detect statistical dependencies of all types, i.e., both linear and nonlinear temporal correlations. In practice, however, transfer entropy is computed from data that have been coarse-grained into three partitions. It is therefore natural to investigate how the degree of discretization of the two series affects the transfer entropy. In this paper, we analyze results based on data generated by linear and ARFIMA models, as well as a dataset consisting of seven indices over the period 1992–2002. The results show that the higher the degree of data discretization, the larger the value of the transfer entropy; the direction of the information flow, however, is unchanged regardless of the degree of discretization.
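The quantity described above can be sketched as a plug-in estimator: coarse-grain each series into a few partitions, count joint symbol occurrences, and sum p(x_{t+1}, x_t, y_t) · log [ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]. The function below is a minimal illustration (names and the equiprobable-binning choice are ours, not the paper's):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=3):
    """Estimate the transfer entropy T(Y -> X) in bits after
    coarse-graining both series into `bins` equiprobable partitions."""
    def discretize(s):
        edges = np.quantile(s, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(s, edges)

    xd, yd = discretize(np.asarray(x, float)), discretize(np.asarray(y, float))
    n = len(xd) - 1
    c_xyz = Counter(zip(xd[1:], xd[:-1], yd[:-1]))  # (x_{t+1}, x_t, y_t)
    c_xy = Counter(zip(xd[:-1], yd[:-1]))           # (x_t, y_t)
    c_xx = Counter(zip(xd[1:], xd[:-1]))            # (x_{t+1}, x_t)
    c_x = Counter(xd[:-1])                          # x_t

    te = 0.0
    for (xn, xp, yp), cnt in c_xyz.items():
        p_joint = cnt / n                       # p(x_{t+1}, x_t, y_t)
        p_cond_xy = cnt / c_xy[(xp, yp)]        # p(x_{t+1} | x_t, y_t)
        p_cond_x = c_xx[(xn, xp)] / c_x[xp]     # p(x_{t+1} | x_t)
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te
```

On synthetic data where x lags y (e.g. `x[t] = 0.8 * y[t-1] + noise` with y white noise), `transfer_entropy(x, y)` comes out much larger than `transfer_entropy(y, x)`, recovering the true direction of information flow. The asymmetry of the measure is exactly why it can assign a direction, unlike mutual information.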


2012 ◽  
Vol 2012 ◽  
pp. 1-16 ◽  
Author(s):  
Ying Liu ◽  
Selin Aviyente

Effective connectivity refers to the influence one neural system exerts on another and corresponds to the parameters of a model that tries to explain the observed dependencies. In this sense, effective connectivity captures the intuitive notion of coupling or directed causal influence. Traditional measures of effective connectivity include model-based methods, such as dynamic causal modeling (DCM) and Granger causality (GC), as well as information-theoretic methods. Directed information (DI) is a recently proposed information-theoretic measure that captures the causality between two time series. Compared to traditional causality detection methods based on linear models, directed information is model-free and can detect both linear and nonlinear causal relationships. However, the effectiveness of DI for capturing causality in different models and in neurophysiological data has not been thoroughly illustrated to date. In addition, the advantage of DI over model-based measures, especially those used to implement Granger causality, has not been fully investigated. In this paper, we address these issues by evaluating the performance of directed information on both simulated data sets and electroencephalogram (EEG) data to illustrate its effectiveness for quantifying effective connectivity in the brain.
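For contrast with the model-free DI measure, the linear Granger test the abstract mentions reduces to comparing two regressions: does adding the past of y shrink the prediction error of x? A minimal least-squares sketch (function name and the log-variance-ratio statistic are illustrative conventions, not this paper's implementation):

```python
import numpy as np

def granger_causality(x, y, p=1):
    """Granger statistic for Y -> X: log ratio of residual sums of squares
    of an AR(p) model for x using only its own past (restricted) versus
    one that also includes y's past (full). Larger means stronger evidence."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    past_x = np.column_stack([x[p - k - 1 : n - k - 1] for k in range(p)])
    past_y = np.column_stack([y[p - k - 1 : n - k - 1] for k in range(p)])
    target = x[p:]

    def rss(design):
        design = np.column_stack([np.ones(len(target)), design])
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        resid = target - design @ beta
        return resid @ resid

    return np.log(rss(past_x) / rss(np.column_stack([past_x, past_y])))
```

The key limitation motivating DI is visible in the design matrices: the statistic only measures *linear* predictability, so a purely nonlinear coupling (e.g. x driven by y²) can yield a near-zero Granger statistic while a model-free measure still detects the dependence.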


2016 ◽  
Vol 39 (8) ◽  
pp. 1253-1261 ◽  
Author(s):  
Xiaoli Luan ◽  
Yang Min ◽  
Zhengtao Ding ◽  
Fei Liu

In this study, the given-time H∞ consensus problem over networks with directed information flow and Markov jump topologies is addressed. Our focus is on keeping the disagreement dynamics of the network confined within a prescribed bound over a fixed time interval. Compared with asymptotic consensus, which is reached only as the settling time goes to infinity, the proposed algorithm is less conservative. In addition, a new model transformation approach is presented to make the design results more general. Simulation results show the effectiveness of the proposed controller and reveal that the prescribed bound on the disagreement trajectory affects the disturbance rejection performance.
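The underlying mechanism in such consensus problems is Laplacian disagreement dynamics over a directed graph. The snippet below is only a sketch of that plain, fixed-topology building block (not the paper's Markov-jump H∞ controller): each agent repeatedly moves toward the neighbors it can observe, and the disagreement shrinks geometrically.

```python
import numpy as np

# Directed ring of 4 agents: agent i observes only agent (i+1) mod 4.
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)   # adjacency (row i: who i hears)
L = np.diag(A.sum(axis=1)) - A              # graph Laplacian
eps = 0.3                                   # step size, below 1/max out-degree

x = np.array([1.0, -2.0, 0.5, 3.0])         # initial agent states
for _ in range(200):
    x = x - eps * (L @ x)                   # x_{k+1} = (I - eps L) x_k
# After many steps the states agree; since this ring is balanced
# (in-degree = out-degree), they agree on the initial average.
```

The paper's contribution sits on top of this picture: rather than waiting for the asymptotic limit of the iteration above, it bounds how far the states may disagree within a *given* time window, while the topology itself switches according to a Markov chain.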

