Multiple Time-Scale Market Dependency: Application of Wavelet Analysis on High Frequency Data of Asian Markets

2009 ◽  
Author(s):  
Lixia Loh ◽  
Gurcharan S.


2004 ◽
Vol 07 (05) ◽  
pp. 615-643 ◽  
Author(s):  
ERHAN BAYRAKTAR ◽  
H. VINCENT POOR ◽  
K. RONNIE SIRCAR

S&P 500 index data sampled at one-minute intervals over the course of 11.5 years (January 1989–May 2000) is analyzed, and in particular the Hurst parameter over segments of stationarity (the time period over which the Hurst parameter is almost constant) is estimated. An asymptotically unbiased and efficient estimator based on the log-scale spectrum is employed. The estimator is asymptotically Gaussian, and the variance of the estimate obtained from a data segment of N points is of order [Formula: see text]. Wavelet analysis is tailor-made for this high-frequency data set, since it has low computational complexity thanks to the pyramidal algorithm for computing the detail coefficients. The estimator is robust to additive non-stationarities, and here it is shown to exhibit some degree of robustness to multiplicative non-stationarities, such as seasonalities and volatility persistence, as well. The analysis suggests that the market became more efficient in the period 1997–2000.
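The log-scale spectrum approach described above can be sketched as follows: compute Haar detail coefficients level by level with the pyramidal algorithm, regress the log2 of the per-level detail energy on the level index, and read the Hurst parameter off the slope (for a self-similar increment process the slope is 2H − 1). This is a generic, simplified illustration on synthetic white noise (H ≈ 0.5), not the paper's exact estimator; the function name and thresholds are hypothetical.

```python
import numpy as np

def hurst_log_scale(x, min_coeffs=32):
    """Estimate the Hurst parameter from the slope of the wavelet
    log-scale spectrum, using the O(N) pyramidal Haar algorithm.
    Illustrative sketch only; `min_coeffs` limits how deep we go so
    each level's energy is averaged over enough coefficients."""
    a = np.asarray(x, dtype=float)
    levels, log_energy = [], []
    j = 0
    while a.size // 2 >= min_coeffs:
        a = a[: 2 * (a.size // 2)]                 # drop an odd leftover sample
        d = (a[0::2] - a[1::2]) / np.sqrt(2.0)     # detail coefficients at level j+1
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)     # approximation for the next level
        j += 1
        levels.append(j)
        log_energy.append(np.log2(np.mean(d ** 2)))
    # For self-similar increments, log2 E_j is linear in j with slope 2H - 1.
    slope = np.polyfit(levels, log_energy, 1)[0]
    return (slope + 1.0) / 2.0

rng = np.random.default_rng(0)
h = hurst_log_scale(rng.standard_normal(2 ** 14))  # white noise, so H should be near 0.5
```

On white noise the detail energy is flat across levels, so the fitted slope is near zero and the estimate is near 0.5; a persistent series (H > 0.5) would show energy growing with level.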


Fractals ◽  
2002 ◽  
Vol 10 (01) ◽  
pp. 13-18 ◽  
Author(s):  
YOSHIAKI KUMAGAI

We propose a new method to describe the scaling behavior of time series. We introduce an extension of extreme values: extreme values determined by a scale, from which we define a family of functions. Using these functions, we can measure a kind of fractal dimension, the fold dimension. In financial high-frequency data, observations can occur at varying time intervals; these functions allow us to analyze non-equidistant data without interpolation or evenly spaced resampling. Further, the problem of choosing an appropriate time scale is avoided. Lastly, these functions are related to the viewpoint of an investor whose transaction costs coincide with the spread.


Author(s):  
Josip Arnerić

Availability of high-frequency data, in line with IT developments, enables the use of more information to estimate not only the variance (volatility) but also higher realized moments and the entire realized distribution of returns. Old-fashioned approaches use only closing prices and assume that the underlying distribution is time-invariant, which makes traditional forecasting models unreliable. Moreover, time-varying realized moments support the finding that returns are not identically distributed across trading days. The objective of the paper is to find an appropriate data-driven distribution of returns using high-frequency data. The kernel estimation method is applied to DAX intraday prices, balancing the bias and the variance of the realized moments with respect to both the bandwidth selection and the sampling frequency selection. The main finding is that the kernel bandwidth is strongly related to the sampling frequency at the slow time scale when applying a two-scale estimator, while the fast-time-scale sampling frequency is held fixed. The realized kernel density estimation enriches the literature by providing the best data-driven proxy of the true but unknown probability density function of returns, which can be used as a benchmark in comparison against ex-ante or implied moments.
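The data-driven density idea can be illustrated with a minimal sketch: a plain Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth applied to a simulated heavy-tailed return series. This is not the paper's two-scale realized kernel estimator; the return series, grid, and function name are all assumptions for illustration.

```python
import numpy as np

def gaussian_kde(returns, grid):
    """Gaussian kernel density estimate of `returns` evaluated on `grid`,
    with Silverman's rule-of-thumb bandwidth. Illustrative sketch only."""
    r = np.asarray(returns, dtype=float)
    n = r.size
    h = 1.06 * r.std(ddof=1) * n ** (-1 / 5)       # Silverman bandwidth
    z = (grid[:, None] - r[None, :]) / h           # standardized distances
    kernels = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=1) / h                # average of the kernel bumps

rng = np.random.default_rng(1)
returns = 0.001 * rng.standard_t(df=5, size=2000)  # toy heavy-tailed intraday returns
grid = np.linspace(-0.01, 0.01, 401)
density = gaussian_kde(returns, grid)
mass = density.sum() * (grid[1] - grid[0])         # numerical integral, close to 1
```

In the realized setting the bandwidth choice interacts with the sampling frequency, as the abstract notes; here Silverman's rule stands in as the simplest bias–variance compromise.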


2017 ◽  
Author(s):  
Rim Lamouchi ◽  
Russell Davidson ◽  
Ibrahim Fatnassi ◽  
Abderazak Ben Maatoug
