Using Transfer Entropy to Measure Information Flows Between Financial Markets

Author(s):  
Franziska J. Peter ◽  
Thomas Dimpfl ◽  
Luis Huergo


Entropy ◽ 
2019 ◽  
Vol 21 (11) ◽  
pp. 1124 ◽  
Author(s):  
Jan Korbel ◽  
Xiongfei Jiang ◽  
Bo Zheng

In this paper, we analyze information flows between communities of financial markets, represented as complex networks. Each community, typically corresponding to a business sector, represents a significant part of the financial market, and detecting interactions between communities is crucial for the analysis of risk spreading in financial markets. We show that transfer entropy provides a coherent description of information flows within and between communities, also capturing non-linear interactions. In particular, we focus on the information transfer of rare events, typically large drops, which can spread through the network. These events can be analyzed with Rényi transfer entropy, which makes it possible to accentuate particular types of events. We analyze transfer entropies between communities of the five largest financial markets and compare the information flows with the correlation network of each market. From the transfer entropy picture, we can also identify the non-linear interactions that are typical of extreme events. The strongest flows are typically observed between specific types of business sectors; the financial sector is the most significant example.
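As background for the measures discussed above, a minimal plug-in estimate of Shannon transfer entropy on symbolised (here binary) series can be sketched as follows; the function name, binary symbolisation, and history length of one are illustrative choices, not the authors' implementation:

```python
# Minimal plug-in estimate of Shannon transfer entropy TE(X -> Y) on
# symbolised (binary) series, with history length 1. Illustrative only;
# the papers above use more refined estimators.
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """TE(X -> Y) in bits: how much x's past improves prediction of y."""
    # One triple (y_next, y_prev, x_prev) per time step.
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    c_full = Counter(triples)
    c_yy = Counter((yn, yp) for yn, yp, _ in triples)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (yn, yp, xp), k in c_full.items():
        # p(y_next | y_prev, x_prev) / p(y_next | y_prev)
        te += k / n * log2((k / c_yx[yp, xp]) / (c_yy[yn, yp] / c_y[yp]))
    return te

# Demo: y is a one-step lagged copy of x, so about 1 bit flows from X to Y,
# while an independent series z transfers almost nothing.
rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
z = [rng.randint(0, 1) for _ in range(2000)]
te_xy = transfer_entropy(x, y)  # close to 1 bit
te_zy = transfer_entropy(z, y)  # close to 0 bits
```

Rényi transfer entropy replaces the Shannon (logarithmic) information measure with the Rényi measure of order q, which re-weights rare events such as large drops; the plug-in counting scheme above generalises accordingly.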


Author(s):  
Nicoló Andrea Caserini ◽  
Paolo Pagnottoni

In this paper we propose to study the dynamics of financial contagion between the credit default swap (CDS) and sovereign bond markets through effective transfer entropy, a model-free methodology which makes it possible to overcome the restrictive hypotheses of classical price discovery measures in the statistical and econometric literature, without being limited to linear dynamics. By means of effective transfer entropy we correct for the small-sample biases which affect the traditional Shannon transfer entropy, and we are also able to conduct inference on the estimated directional information flows. In our empirical application, we analyze CDS and bond market data for eight countries of the European Union, and aim to discover which of the two assets is faster at incorporating information on the credit risk of the underlying sovereign. Our results show a clear and statistically significant prominence of the bond market in pricing sovereign credit risk, especially during the crisis period. During the post-crisis period, instead, a few countries, in particular Spain and the Netherlands, behave differently from the others.
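The effective transfer entropy idea can be sketched as follows: the raw plug-in estimate is corrected by the mean estimate over source-shuffled surrogates, which keep the source's marginal statistics but destroy its directed coupling. The binary symbols, history length of one, and function names below are a simplified illustration, not the authors' code:

```python
# Effective transfer entropy: raw plug-in TE minus the mean TE over
# source-shuffled surrogates, which estimates the small-sample bias.
# Simplified illustration with binary symbols and history length 1.
import random
from collections import Counter
from math import log2

def plug_in_te(x, y):
    """Plug-in TE(X -> Y) in bits, history length 1."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    c_full = Counter(triples)
    c_yy = Counter((yn, yp) for yn, yp, _ in triples)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    return sum(k / n * log2((k / c_yx[yp, xp]) / (c_yy[yn, yp] / c_y[yp]))
               for (yn, yp, xp), k in c_full.items())

def effective_te(x, y, n_shuffles=50, seed=1):
    """Raw TE minus the mean TE of shuffled-source surrogates."""
    rng = random.Random(seed)
    surrogate_mean = 0.0
    for _ in range(n_shuffles):
        xs = x[:]
        rng.shuffle(xs)  # destroys x's timing, keeps its distribution
        surrogate_mean += plug_in_te(xs, y)
    return plug_in_te(x, y) - surrogate_mean / n_shuffles

# Demo: a coupled pair keeps nearly all of its raw TE; for an independent
# pair the surrogate correction cancels the bias, leaving roughly zero.
rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
z = [rng.randint(0, 1) for _ in range(2000)]
```

The surrogate distribution also supports inference: comparing the raw estimate against the surrogate values yields a significance test for the directional flow.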


Author(s):  
Tran Thi Tuan Anh

This paper uses transfer entropy to measure and identify the information flows between stock markets in the ASEAN region. Daily closing stock indices for Vietnam, the Philippines, Malaysia, Indonesia, Thailand, and Singapore are collected for the period from March 2012 to October 2019 to calculate these transfer entropies. The results can be read in two ways: how the information flow originating in one market is received by the other markets, and how much information flow each market receives. From the perspective of incoming transfer entropy, Vietnam is the country most affected by information from the other ASEAN markets, while Indonesia and Malaysia are the least affected. In terms of outgoing entropy, Thailand is the largest source of information flow to the ASEAN markets, while Malaysia and the Philippines exert the smallest information impact on the other countries. The research also reveals that the Singapore stock market is rather separate from the other ASEAN countries. These results imply that, for investors and policymakers, identifying the information flows among ASEAN stock markets can help to predict market movements, and thereby to develop a suitable investment strategy or establish appropriate management policies.
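The incoming/outgoing decomposition used above can be illustrated on simulated data: sum each row of the pairwise transfer-entropy matrix to get a market's outgoing flow, and each column to get its incoming flow. The hub-and-followers data, the market labels, and the simple binary plug-in estimator below are illustrative stand-ins, not the paper's data or method:

```python
# Sketch of incoming vs outgoing transfer entropy on simulated markets:
# a hub series (standing in for Thailand in the abstract's finding) drives
# two noisy followers with a one-day lag. Data and estimator are
# illustrative, not the paper's.
import random
from collections import Counter
from math import log2

def plug_in_te(x, y):
    """Plug-in TE(X -> Y) in bits, binary symbols, history length 1."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    c_full = Counter(triples)
    c_yy = Counter((yn, yp) for yn, yp, _ in triples)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    return sum(k / n * log2((k / c_yx[yp, xp]) / (c_yy[yn, yp] / c_y[yp]))
               for (yn, yp, xp), k in c_full.items())

rng = random.Random(7)
hub = [rng.randint(0, 1) for _ in range(3000)]

def follower():
    # Copies yesterday's hub move 80% of the time, otherwise random.
    return [0] + [h if rng.random() < 0.8 else rng.randint(0, 1)
                  for h in hub[:-1]]

series = {"TH": hub, "M1": follower(), "M2": follower()}
names = list(series)
outgoing = {s: sum(plug_in_te(series[s], series[t]) for t in names if t != s)
            for s in names}
incoming = {t: sum(plug_in_te(series[s], series[t]) for s in names if s != t)
            for t in names}
```

In this construction the hub dominates the outgoing ranking while the followers dominate the incoming one, mirroring the paper's distinction between markets that mainly send and markets that mainly receive information.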


2008 ◽  
Vol 11 (01) ◽  
pp. 17-41 ◽  
Author(s):  
NIHAT AY ◽  
DANIEL POLANI

We use a notion of causal independence based on intervention, which is a fundamental concept of the theory of causal networks, to define a measure for the strength of a causal effect. We call this measure "information flow" and compare it with known information flow measures such as transfer entropy.


2018 ◽  
pp. 76-81
Author(s):  
Oleksandr Lavryk ◽  
Karina Jafarbaghi

Introduction. The significance of modern research on investment processes in financial markets stems from the interrelation of the financial and industrial sectors of the national economy. We study the relevant levers and interactions, which strongly influence the economy as a whole and open up opportunities for continuously improving the stability and efficiency of economic activity. Given the constant and significant shortage of self-financing among national companies, the problem of attracting funds is acute, and powerful market-based financial instruments should be used. Purpose. The article aims to create a conceptual framework for increasing the stability and efficiency of modern investment processes in financial markets. Method (methodology). Methods of risk and financing-efficiency estimation and analysis, estimation of financial and economic performance, and statistical methods have been used to solve these tasks. Results. The practice of managing modern investment processes at the macro level and at the country level is characterized by considerable uncertainty in the functions of such a management system, a low level of coordination of their interaction, and poor ordering of information flows. We therefore propose schemes of information flows and a functional structure for the management of investment processes in modern financial markets.


2020 ◽  
Author(s):  
David P. Shorten ◽  
Richard E. Spinney ◽  
Joseph T. Lizier

Transfer entropy (TE) is a widely used measure of directed information flows in a number of domains including neuroscience. Many real-world time series for which we are interested in information flows come in the form of (near) instantaneous events occurring over time, including the spiking of biological neurons, trades on stock markets and posts to social media. However, there exist severe limitations to the current approach to TE estimation on such event-based data via discretising the time series into time bins: it is not consistent, has high bias, converges slowly and cannot simultaneously capture relationships that occur with very fine time precision as well as those that occur over long time intervals. Building on recent work which derived a theoretical framework for TE in continuous time, we present an estimation framework for TE on event-based data and develop a k-nearest-neighbours estimator within this framework. This estimator is provably consistent, has favourable bias properties and converges orders of magnitude more quickly than the discrete-time estimator on synthetic examples. We also develop a local permutation scheme for generating null surrogate time series to test for the statistical significance of the TE and, as such, test for the conditional independence between the history of one point process and the updates of another — signifying the lack of a causal connection under certain weak assumptions. Our approach is capable of detecting conditional independence or otherwise even in the presence of strong pairwise time-directed correlations.
The power of this approach is further demonstrated on the inference of the connectivity of biophysical models of a spiking neural circuit inspired by the pyloric circuit of the crustacean stomatogastric ganglion, succeeding where previous related estimators have failed.

Author summary: Transfer entropy (TE) is an information-theoretic measure commonly used in neuroscience to measure the directed statistical dependence between a source and a target time series, possibly also conditioned on other processes. Along with measuring information flows, it is used for the inference of directed functional and effective networks from time series data. The currently used technique for estimating TE on neural spike trains first time-discretises the data and then applies a straightforward or “plug-in” information-theoretic estimation procedure. This approach has numerous drawbacks: it is very biased, it cannot capture relationships occurring on both fine and large timescales simultaneously, it converges very slowly as more data are obtained, and indeed it does not even converge to the correct value. We present a new estimator for TE which operates in continuous time, demonstrating via application to synthetic examples that it addresses these problems and can reliably differentiate statistically significant flows from (conditionally) independent spike trains. Further, we apply it to more biologically realistic spike trains obtained from a biophysical model of the pyloric circuit of the crustacean stomatogastric ganglion; our correct inference of the underlying connection structure here provides an important validation for our approach where similar methods have previously failed.


2021 ◽  
Vol 17 (4) ◽  
pp. e1008054
Author(s):  
David P. Shorten ◽  
Richard E. Spinney ◽  
Joseph T. Lizier

Transfer entropy (TE) is a widely used measure of directed information flows in a number of domains including neuroscience. Many real-world time series for which we are interested in information flows come in the form of (near) instantaneous events occurring over time. Examples include the spiking of biological neurons, trades on stock markets and posts to social media, amongst myriad other systems involving events in continuous time throughout the natural and social sciences. However, there exist severe limitations to the current approach to TE estimation on such event-based data via discretising the time series into time bins: it is not consistent, has high bias, converges slowly and cannot simultaneously capture relationships that occur with very fine time precision as well as those that occur over long time intervals. Building on recent work which derived a theoretical framework for TE in continuous time, we present an estimation framework for TE on event-based data and develop a k-nearest-neighbours estimator within this framework. This estimator is provably consistent, has favourable bias properties and converges orders of magnitude more quickly than the current state-of-the-art in discrete-time estimation on synthetic examples. We demonstrate failures of the traditionally-used source-time-shift method for null surrogate generation. In order to overcome these failures, we develop a local permutation scheme for generating surrogate time series conforming to the appropriate null hypothesis in order to test for the statistical significance of the TE and, as such, test for the conditional independence between the history of one point process and the updates of another. Our approach is shown to be capable of correctly rejecting or accepting the null hypothesis of conditional independence even in the presence of strong pairwise time-directed correlations. 
This capacity to accurately test for conditional independence is further demonstrated on models of a spiking neural circuit inspired by the pyloric circuit of the crustacean stomatogastric ganglion, succeeding where previous related estimators have failed.
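The surrogate-significance idea can be illustrated in discrete time. The paper's local permutation scheme operates in continuous time on event data and respects the conditioning processes; the shuffle-based test below is only a simplified discrete-time stand-in, with illustrative names throughout:

```python
# Simplified surrogate test for the significance of an estimated TE:
# compare the observed value against a null distribution built from
# source-shuffled surrogates. The paper's local permutation scheme for
# continuous-time event data is more careful; this shows the basic idea.
import random
from collections import Counter
from math import log2

def plug_in_te(x, y):
    """Plug-in TE(X -> Y) in bits, binary symbols, history length 1."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    c_full = Counter(triples)
    c_yy = Counter((yn, yp) for yn, yp, _ in triples)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    return sum(k / n * log2((k / c_yx[yp, xp]) / (c_yy[yn, yp] / c_y[yp]))
               for (yn, yp, xp), k in c_full.items())

def te_p_value(x, y, n_surrogates=99, seed=0):
    """One-sided surrogate p-value for TE(X -> Y)."""
    rng = random.Random(seed)
    observed = plug_in_te(x, y)
    exceed = 0
    for _ in range(n_surrogates):
        xs = x[:]
        rng.shuffle(xs)  # null: source timing carries no information
        exceed += plug_in_te(xs, y) >= observed
    return (exceed + 1) / (n_surrogates + 1)

# Demo: a genuinely coupled pair is flagged as significant; an independent
# source is typically not.
rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
z = [rng.randint(0, 1) for _ in range(2000)]
p_coupled = te_p_value(x, y)
p_null = te_p_value(z, y)
```

Rejecting this null amounts to rejecting conditional independence between the source's history and the target's updates, which is the hypothesis the paper's continuous-time scheme tests directly on spike trains.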


Author(s):  
Dorje C. Brody ◽  
Lane P. Hughston ◽  
Andrea Macrina
