Modelling fluctuations of financial time series: from cascade process to stochastic volatility model

2000 ◽  
Vol 17 (3) ◽  
pp. 537-548 ◽  
Author(s):  
J.F. Muzy ◽  
J. Delour ◽  
E. Bacry
2021 ◽  
Author(s):  
Konstantin Kvatch

<p>The thesis will have two main parts. First, let us start with an example. In finance, the standard version of the Black-Scholes formula is a beautiful closed-form solution used to price European options. This famous formula is ingenious, but it has a flaw that relegates it to something that should be admired, and perhaps not used in the real world. It relies on the assumption that share prices evolve according to geometric Brownian motion. This means that we are willing to accept that extreme shocks to prices are almost impossible. Is this a realistic assumption? Of course not. The stock market crashes of 1929 and 1987 are clear examples that extreme events do happen. More recently, the 1997 Asian crisis and the 2000 NASDAQ crash show that, in addition, such events are not so rare. Such jumps occur even more frequently, and with larger magnitude, in the share prices of individual companies. This problem is by no means new, and a plethora of models and pricing techniques have been developed. The standard Black-Scholes formula is just one example, but it is simply an illustration of the matter at hand. The process that we use to model a financial time series is of paramount importance, whether we do it for forecasting purposes or for pricing financial derivatives. If we choose a model that does not capture the key empirical aspects of the data, then any subsequent inference may be very unfavourably biased. It is because of this problem that we should investigate the more standard modelling that assumes continuity and normal or log-normal distribution of financial time series. We will begin from the very basics, and we will see that this is a wonderful piece of theory, deserving of its reputation for being simple, groundbreaking and extremely useful. This work should bring us to a position where we can evaluate a second goal.
Stochastic processes with jumps and "heavy tails" have existed for some time, but have begun to filter through to the financial industry only recently. This lag is due to the perceived added conceptual difficulty of introducing such models, although we will see that this should not be the case. There is plenty of real evidence that financial time series exhibit discontinuous behaviour and that these series are far from normally or log-normally distributed. Rather than looking at standard models as correct, and at jump or stochastic volatility models as complicated, we should look upon standard models as educational but not sufficient for the real world. Stochastic volatility and jump models should instead be viewed as natural. The theme of the thesis is the importance of choosing a correct model for the underlying process. Although we may speak of the implications of some models for hedging, we will not actually look at specific hedging techniques. The particular aspect of pricing is also not considered in full scope, although we will see the Black-Scholes pricing formula. We will consider the main problem to be specifying the model correctly; the method of pricing is a subsequent technicality. In examples we may take pricing tools like Monte Carlo simulation as a given. We will not strive for full generality or formality, but rather take a physical approach and aim for clarity and understanding. Let us now move on to the beginning, with the introduction of our primary source of randomness.</p>
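The closed-form solution discussed in this abstract can be sketched directly. The following is a minimal, stdlib-only rendering of the standard Black-Scholes call price, with the usual notation (S spot, K strike, r risk-free rate, sigma volatility, T time to expiry in years); the function name is our own choice for illustration.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    # Closed-form price of a European call when the underlying follows
    # geometric Brownian motion dS = r*S dt + sigma*S dW (risk-neutral).
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# At-the-money example: S = K = 100, r = 5%, sigma = 20%, one year.
print(black_scholes_call(100.0, 100.0, 0.05, 0.2, 1.0))  # ~10.45
```

The GBM assumption enters through the lognormal terminal distribution implicit in the d1/d2 terms; this is exactly where the jump and heavy-tail critique above bites.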


2019 ◽  
Vol 17 (4) ◽  
pp. 22
Author(s):  
Omar Abbara ◽  
Mauricio Zevallos

<p>The paper assesses the method proposed by Shumway and Stoffer (2006, Chapter 6, Section 10) to estimate the parameters and volatility of stochastic volatility models. First, the paper presents a Monte Carlo evaluation of the parameter estimates considering several distributions for the perturbations in the observation equation. Second, the method is assessed empirically, through backtesting evaluation of VaR forecasts of the S&amp;P 500 time series returns. In both analyses, the paper also evaluates the convenience of using the Fuller transformation.</p>
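For orientation, the canonical SV model that such estimation methods target can be simulated in a few lines. This is a hedged sketch under standard assumptions (AR(1) log-volatility h_t, Gaussian perturbations), not the paper's exact setup; parameter values are illustrative. Taking log(y_t^2) linearizes the observation equation, which is the device behind Kalman-filter-based approaches of this kind.

```python
import math
import random

def simulate_sv(n, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=42):
    # h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * w_t   (latent log-volatility)
    # y_t = exp(h_t / 2) * eps_t                           (observed return)
    rng = random.Random(seed)
    h = mu
    returns, logvols = [], []
    for _ in range(n):
        h = mu + phi * (h - mu) + sigma_eta * rng.gauss(0, 1)
        y = math.exp(h / 2) * rng.gauss(0, 1)
        returns.append(y)
        logvols.append(h)
    return returns, logvols
```

A simulated series like this is the natural testbed for the kind of Monte Carlo evaluation of parameter estimates described in the abstract.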


2009 ◽  
Vol 18 (08) ◽  
pp. 1381-1396 ◽  
Author(s):  
TETSUYA TAKAISHI

The hybrid Monte Carlo (HMC) algorithm is applied to the Bayesian inference of the stochastic volatility (SV) model. We use the HMC algorithm for the Markov chain Monte Carlo updates of the volatility variables of the SV model. First, we estimate the parameters of the SV model using artificial financial data and compare the results from the HMC algorithm with those from the Metropolis algorithm. We find that the HMC algorithm decorrelates the volatility variables faster than the Metropolis algorithm. Second, we make an empirical study of the Nikkei 225 stock index time series using the HMC algorithm. We find correlation behavior for the sampled data similar to the results from the artificial financial data and obtain a ϕ value close to one (ϕ ≈ 0.977), which means that the time series exhibits strong persistence of volatility shocks.
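The core HMC update used for the volatility variables can be illustrated in miniature. This is a minimal one-dimensional sketch (momentum draw, leapfrog integration, Metropolis accept/reject on the total energy); the target below is a generic log-density stand-in, not the SV posterior itself, and the step size and trajectory length are illustrative.

```python
import math
import random

def hmc_step(x, log_prob, grad_log_prob, eps=0.1, n_leapfrog=20, rng=random):
    p = rng.gauss(0, 1)                      # resample momentum
    x_new, p_new = x, p
    # Leapfrog integration of the Hamiltonian dynamics.
    p_new += 0.5 * eps * grad_log_prob(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += eps * p_new
        p_new += eps * grad_log_prob(x_new)
    x_new += eps * p_new
    p_new += 0.5 * eps * grad_log_prob(x_new)
    # Metropolis accept/reject on the total energy H = -log p(x) + p^2/2.
    h_old = -log_prob(x) + 0.5 * p * p
    h_new = -log_prob(x_new) + 0.5 * p_new * p_new
    if math.log(rng.random()) < h_old - h_new:
        return x_new
    return x
```

The gradient-guided leapfrog trajectories are what let HMC make large, distant proposals with high acceptance, which is the mechanism behind the faster decorrelation of volatility variables reported above.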


Entropy ◽  
2021 ◽  
Vol 23 (6) ◽  
pp. 689
Author(s):  
Łukasz Lenart ◽  
Anna Pajor ◽  
Łukasz Kwiatkowski

In the paper, we begin by introducing a novel scale mixture of normal distributions whose leptokurticity and fat-tailedness are only local, with this "locality" separately controlled by two censoring parameters. This new, locally leptokurtic and fat-tailed (LLFT) distribution makes a viable alternative to other, globally leptokurtic, fat-tailed and symmetric distributions typically entertained in financial volatility modelling. Then, we incorporate the LLFT distribution into a basic stochastic volatility (SV) model to yield a flexible alternative to common heavy-tailed SV models. For the resulting LLFT-SV model, we develop a Bayesian statistical framework and effective MCMC methods to enable posterior sampling of the parameters and latent variables. Empirical results indicate the validity of the LLFT-SV specification for modelling both "non-standard" financial time series with repeating zero returns and more "typical" data on the S&P 500 and DAX indices. For the former, the LLFT-SV model is also shown to markedly outperform a common, globally heavy-tailed, t-SV alternative in terms of density forecasting. Applications of the proposed distribution in more advanced SV models seem easily attainable.
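The LLFT construction itself is specific to the paper, but the scale-mixture-of-normals device it builds on can be sketched with the textbook example: a Student-t variate arises by first drawing the normal's variance from an inverse-gamma mixing distribution. This is a hedged illustration of the general mechanism, not of the LLFT distribution.

```python
import math
import random

def t_via_scale_mixture(nu, rng):
    # Mixing step: lam ~ InvGamma(nu/2, nu/2), generated as the reciprocal
    # of a Gamma(shape=nu/2, scale=2/nu) draw.
    lam = 1.0 / rng.gammavariate(nu / 2.0, 2.0 / nu)
    # Conditional step: x | lam ~ N(0, lam). Marginally, x ~ t with nu d.o.f.
    return math.sqrt(lam) * rng.gauss(0, 1)
```

Heavier tails come from the mixing distribution putting mass on large variances; the LLFT idea, as the abstract describes it, censors this mechanism so the excess kurtosis and fat tails act only locally.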


Author(s):  
Zouhaier Dhifaoui

Determinism and non-linear behaviour in the log-return and conditional volatility time series of stock market indices are examined for twenty-six countries. To this end, the principal statistical techniques used in this study are a robust estimator of correlation dimension, a normalized non-linear prediction error, and the pseudo-periodic surrogate data method. The proposed approach indicates, first, the stochastic behaviour of all log-return time series and, second, the inability of local linear, ARMA, or state-dependent noise models (such as ARCH, GARCH, and EGARCH) to describe their structure for frontier, emerging, and developed markets. The same stochastic behaviour is detected in the conditional volatility time series estimated by the stochastic volatility model with moving average innovations. This finding demonstrates the efficiency of the stochastic volatility model compared with the analysed types of GARCH models for all studied markets. JEL Classification: C12, C52, D53, E44
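The correlation-dimension estimates mentioned above rest on the correlation sum. The following is a plain, brute-force sketch of that quantity (Grassberger-Procaccia style, delay-one embedding, max norm); the paper's robust estimator and surrogate-data tests are more involved than this naive version.

```python
def correlation_sum(series, m, r):
    # Embed the scalar series in m dimensions with unit delay, then count
    # the fraction of distinct vector pairs closer than r in the max norm.
    vecs = [series[i:i + m] for i in range(len(series) - m + 1)]
    n = len(vecs)
    close = 0
    for i in range(n):
        for j in range(i + 1, n):
            if max(abs(a - b) for a, b in zip(vecs[i], vecs[j])) < r:
                close += 1
    return 2.0 * close / (n * (n - 1))
```

The correlation dimension is then read off as the slope of log C(r) against log r over a scaling region; for a stochastic series that slope keeps growing with the embedding dimension m, which is the signature behind the "stochastic behaviour" findings above.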

