PROPERTY RISK UNDER SOLVENCY II: EFFECTS OF DIFFERENT UNSMOOTHING TECHNIQUES

2019 ◽  
Vol 25 (1) ◽  
pp. 1-19
Author(s):  
Pablo Durán Santomil ◽  
Luís Otero González ◽  
Onofre Martorell Cunill ◽  
Anna M. Gil-Lafuente

Solvency II imposes risk-based capital requirements on EU insurance companies. This paper evaluates the proposed standard model for property risk. The calibration is based on total returns of the IPD UK monthly index for the period between December 1986 and December 2009. In general, returns derived from valuation-based indices are considered smoother than those derived from transaction-based indices. This paper contributes to the existing literature by applying various unsmoothing techniques to this index. The results show that the capital requirements, computed with the same method used in the calibration of the standard model (historical value at risk at the 99.5% confidence level), are generally higher than those proposed in the standard model of Solvency II.
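As a minimal illustration of the pipeline the abstract describes, the sketch below applies a first-order (Geltner-style) unsmoothing filter to a synthetic appraisal-based return series and compares historical 99.5% value at risk before and after unsmoothing. The smoothing parameter a, the synthetic series and all outputs are assumptions for illustration, not the paper's IPD data; in practice a is typically estimated as the AR(1) coefficient of the smoothed index.

import numpy as np

def unsmooth(smoothed, a):
    # First-order unsmoothing filter: r_t = (r*_t - a * r*_{t-1}) / (1 - a),
    # where r* is the appraisal-based (smoothed) return series.
    r = np.asarray(smoothed, dtype=float)
    return (r[1:] - a * r[:-1]) / (1.0 - a)

def historical_var(returns, level=0.995):
    # Historical VaR: the loss at the (1 - level) empirical quantile.
    return -np.quantile(returns, 1.0 - level)

rng = np.random.default_rng(0)
true_r = rng.normal(0.005, 0.02, 277)      # 277 months, Dec 1986 - Dec 2009 (synthetic)
a = 0.6                                    # assumed smoothing parameter
smoothed = np.empty_like(true_r)
smoothed[0] = true_r[0]
for t in range(1, len(true_r)):            # appraisal smoothing: r*_t = a r*_{t-1} + (1 - a) r_t
    smoothed[t] = a * smoothed[t - 1] + (1 - a) * true_r[t]

print(f"99.5% VaR, smoothed returns:   {historical_var(smoothed):.4f}")
print(f"99.5% VaR, unsmoothed returns: {historical_var(unsmooth(smoothed, a)):.4f}")

The unsmoothed series recovers the volatility hidden by appraisal smoothing, so its historical VaR, and hence the implied capital requirement, comes out higher, consistent with the paper's finding.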

Author(s):  
Răzvan Tudor ◽  
Dumitru Badea

Abstract This paper describes the shortcomings of various models used to quantify and model operational risk within the insurance industry, with a particular focus on Romanian regulation: Norm 6/2015 concerning operational risk arising from IT systems. While most local insurers focus on implementing the standard model to compute the operational risk solvency capital requirement, the local regulator has issued a norm that requires insurers to identify and assess IT-based operational risks from an ISO 27001 perspective. The challenges raised by the correlations assumed in the standard model are substantially increased by this new regulation, which requires only the identification and quantification of IT operational risks. Solvency II does not recommend a model or formula for integrating the newly identified risks into the operational risk capital requirement. In this context, we assess academic and practitioner understanding of the Frequency-Severity approach, Bayesian estimation techniques, Scenario Analysis, and Risk Accounting based on risk units, and how each could support the modelling of IT-based operational risk. Developing an internal model solely for the operational risk capital requirement has so far proved costly and not necessarily beneficial for local insurers. As the IT component will play a key role in the future of the insurance industry, this analysis provides a specific approach to operational risk modelling that can be implemented in the context of Solvency II, in the particular situation where (internal or external) operational risk databases are scarce or unavailable.
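As one hedged sketch of the Frequency-Severity approach mentioned above, the snippet below builds an aggregate annual loss distribution for a single IT operational-risk category from a Poisson event count and lognormal severities, then reads off the 99.5% quantile used as the Solvency II horizon. All parameters are illustrative placeholders, not calibrated values.

import numpy as np

def frequency_severity_var(lam, mu, sigma, n_sims=100_000, level=0.995, seed=1):
    # Monte Carlo aggregate loss: Poisson(lam) annual event counts,
    # lognormal(mu, sigma) severities; returns the VaR at the given level.
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, n_sims)
    totals = np.array([rng.lognormal(mu, sigma, n).sum() for n in counts])
    return np.quantile(totals, level)

# Illustrative parameters: ~3 IT incidents per year, heavy-tailed severities.
print(f"99.5% aggregate-loss VaR: {frequency_severity_var(3.0, 10.0, 1.5):,.0f}")

When internal loss data are scarce, as the abstract emphasises, the same machinery can be fed with frequencies and severities elicited through scenario analysis rather than estimated from loss history.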


Risks ◽  
2019 ◽  
Vol 7 (2) ◽  
pp. 58 ◽  
Author(s):  
Rokas Gylys ◽  
Jonas Šiaulys

The primary objective of this work is to analyze model-based Value-at-Risk associated with mortality risk arising from issued term life assurance contracts and to compare the results with the capital requirements for mortality risk determined using the Solvency II Standard Formula. In particular, two approaches to calculating Value-at-Risk are analyzed: one-year VaR and run-off VaR. The calculations are performed using stochastic mortality rates calibrated with a Lee-Carter model fitted to mortality data of selected European countries. Results indicate that, depending on the approach taken to calculate Value-at-Risk, the key factors driving its relative size are: sensitivity of technical provisions to the latest mortality experience, volatility of mortality rates in a country, policy term and benefit formula. Overall, we find that the Solvency II Standard Formula on average delivers an adequate capital requirement; however, we also highlight particular situations where it could understate or overstate portfolio-specific model-based Value-at-Risk for mortality risk.
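As a hedged sketch of the one-year VaR machinery the abstract describes, the snippet below projects the Lee-Carter period index k_t one year ahead as a random walk with drift and converts the simulated index into a 99.5% shock on the central death rate for a single age. The fitted parameters a_x, b_x and the k_t history are stylised placeholders, not the calibrations used in the paper.

import numpy as np

def simulate_kappa(kappa_hist, n_sims=10_000, seed=2):
    # Project the Lee-Carter period index k_t one year ahead as a random
    # walk with drift; drift and volatility come from the fitted history.
    dk = np.diff(kappa_hist)
    rng = np.random.default_rng(seed)
    return kappa_hist[-1] + rng.normal(dk.mean(), dk.std(ddof=1), n_sims)

# Stylised inputs for a single age x (placeholders, not country-specific fits).
rng = np.random.default_rng(3)
a_x, b_x = -4.5, 0.08
kappa_hist = np.linspace(10.0, -10.0, 40) + rng.normal(0.0, 0.8, 40)

m_sim = np.exp(a_x + b_x * simulate_kappa(kappa_hist))          # simulated m(x, t+1)
best_estimate = np.exp(a_x + b_x * (kappa_hist[-1] + np.diff(kappa_hist).mean()))
shock = np.quantile(m_sim, 0.995) / best_estimate - 1.0
print(f"99.5% one-year mortality shock: {shock:.2%}")

The simulated shock plays the role that the Standard Formula's flat 15% mortality stress plays on the regulatory side, which is the comparison the paper carries out at portfolio level.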


Risks ◽  
2018 ◽  
Vol 6 (3) ◽  
pp. 74 ◽  
Author(s):  
Fabiana Gómez ◽  
Jorge Ponce

This paper provides a rationale for the macro-prudential regulation of insurance companies, in which capital requirements increase with a firm's contribution to systemic risk. In the absence of systemic risk, the formal model in this paper predicts that optimal regulation may be implemented by capital regulation (similar to that observed in practice, e.g., Solvency II) and by actuarially fair technical reserves. However, these instruments are not sufficient when insurance companies are exposed to systemic risk: prudential regulation should also add a systemic component to capital requirements that is non-decreasing in the firm's exposure to systemic risk. Implementing the optimal policy implies separating insurance firms into two categories according to their exposure to systemic risk: those with relatively low exposure should be eligible for bailouts, while those with high exposure should not benefit from public support if a systemic event occurs.
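A stylised reading of the paper's prescription, with an illustrative functional form that is not taken from the paper: a base solvency capital requirement plus a systemic add-on that is non-decreasing in the firm's exposure, and a bailout-eligibility rule keyed to the same exposure threshold.

def capital_requirement(base_scr, exposure, threshold=0.3, slope=2.0):
    # Base SCR plus a systemic add-on, non-decreasing in exposure.
    # Threshold and slope are hypothetical parameters for illustration.
    return base_scr * (1.0 + slope * max(0.0, exposure - threshold))

def bailout_eligible(exposure, threshold=0.3):
    # Firms with relatively low systemic exposure qualify for public support.
    return exposure <= threshold

for e in (0.1, 0.3, 0.6):
    print(f"exposure={e:.1f}  SCR={capital_requirement(100.0, e):6.1f}  "
          f"bailout eligible: {bailout_eligible(e)}")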


2021 ◽  
Vol 2021 (2) ◽  
Author(s):  
E. Cortina Gil ◽  
◽  
A. Kleimenova ◽  
E. Minucci ◽  
S. Padolski ◽  
...  

Abstract The NA62 experiment at the CERN SPS reports a study of a sample of 4 × 10⁹ tagged π⁰ mesons from K⁺ → π⁺π⁰(γ), searching for the decay of the π⁰ to invisible particles. No signal is observed in excess of the expected background fluctuations. An upper limit of 4.4 × 10⁻⁹ is set on the branching ratio at 90% confidence level, improving on previous results by a factor of 60. This result can also be interpreted as a model-independent upper limit on the branching ratio for the decay K⁺ → π⁺X, where X is a particle escaping detection with mass in the range 0.110–0.155 GeV/c² and rest lifetime greater than 100 ps. Model-dependent upper limits are obtained assuming X to be an axion-like particle with dominant fermion couplings or a dark scalar mixing with the Standard Model Higgs boson.
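A back-of-envelope check, not the experiment's statistical treatment, connects the quoted numbers: with N tagged π⁰ mesons and signal efficiency ε, a branching-ratio limit B corresponds to roughly B × N × ε signal events.

# Back-of-envelope only; the actual NA62 limit comes from a full statistical analysis.
n_pi0 = 4e9          # tagged pi0 sample
br_limit = 4.4e-9    # 90% CL upper limit on BR(pi0 -> invisible)
eff = 1.0            # assumed (hypothetical) signal efficiency
print(f"Equivalent limit on signal events: {br_limit * n_pi0 * eff:.1f}")  # ~17.6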


2021 ◽  
Vol 2021 (3) ◽  
Author(s):  
A. M. Sirunyan ◽  
◽  
A. Tumasyan ◽  
W. Adam ◽  
T. Bergauer ◽  
...  

Abstract A search is presented for a Higgs boson that is produced via vector boson fusion and that decays to an undetected particle and an isolated photon. The search is performed by the CMS collaboration at the LHC, using a data set corresponding to an integrated luminosity of 130 fb⁻¹, recorded at a center-of-mass energy of 13 TeV in 2016–2018. No significant excess of events above the expectation from the standard model background is found. The results are interpreted in the context of a theoretical model in which the undetected particle is a massless dark photon. An upper limit is set on the product of the cross section for production via vector boson fusion and the branching fraction for such a Higgs boson decay, as a function of the Higgs boson mass. For a Higgs boson mass of 125 GeV, assuming the standard model production rates, the observed (expected) 95% confidence level upper limit on the branching fraction is 3.5 (2.8)%. This is the first search for such decays in the vector boson fusion channel. Combination with a previous search for Higgs bosons produced in association with a Z boson results in an observed (expected) upper limit on the branching fraction of 2.9 (2.1)% at 95% confidence level.


2021 ◽  
Vol 2021 (2) ◽  
Author(s):  
G. Aad ◽  
◽  
B. Abbott ◽  
D. C. Abbott ◽  
A. Abed Abud ◽  
...  

Abstract A search for dark matter is conducted in final states containing a photon and missing transverse momentum in proton-proton collisions at √s = 13 TeV. The data, collected during 2015–2018 by the ATLAS experiment at the CERN LHC, correspond to an integrated luminosity of 139 fb⁻¹. No deviations from the predictions of the Standard Model are observed and 95% confidence-level upper limits between 2.45 fb and 0.5 fb are set on the visible cross section for contributions from physics beyond the Standard Model, in different ranges of the missing transverse momentum. The results are interpreted as 95% confidence-level limits in models where weakly interacting dark-matter candidates are pair-produced via an s-channel axial-vector or vector mediator. Dark-matter candidates with masses up to 415 (580) GeV are excluded for axial-vector (vector) mediators, while the maximum excluded mass of the mediator is 1460 (1470) GeV. In addition, the results are expressed in terms of 95% confidence-level limits on the parameters of a model with an axion-like particle produced in association with a photon, and are used to constrain the coupling g_aZγ of an axion-like particle to the electroweak gauge bosons.


2018 ◽  
Vol 12 (2) ◽  
pp. 233-248 ◽  
Author(s):  
J. Lévy Véhel

Abstract In this note, we provide a simple example of regulation risk. The idea is that, in certain situations, the very prudential rules (or, rather, some of them) imposed by the regulator in the framework of the Basel II/III Accords or the Solvency II directive are themselves the source of a systemic risk. The instance of regulation risk that we bring to light in this work can be summarised as follows: wrongly assuming that prices evolve in a continuous fashion when they may in fact display large negative jumps, and trying to minimise Value at Risk (VaR) under a constraint of minimal volume of activity, leads in effect to behaviours that will maximise VaR. Although much stylised, our analysis highlights some pitfalls of model-based regulation.
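A minimal numerical sketch of the mechanism, not the paper's model: returns that mix a continuous diffusive part with rare large negative jumps. A VaR computed under a Gaussian (continuous-path) assumption understates the empirical 99.5% VaR of the true, jumpy distribution; all parameters below are illustrative.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n = 500_000

# Stylised daily returns: a diffusive component plus rare large negative jumps.
diffusive = rng.normal(0.0003, 0.005, n)
jump_days = rng.random(n) < 0.01                     # ~1% of days carry a jump
returns = diffusive - rng.exponential(0.08, n) * jump_days

# 99.5% VaR under a (wrong) continuous Gaussian model vs. the actual tail.
mu, sigma = returns.mean(), returns.std(ddof=1)
var_gaussian = -(mu + sigma * norm.ppf(0.005))       # model-based VaR
var_empirical = -np.quantile(returns, 0.005)         # true distribution's VaR
print(f"Gaussian-model VaR 99.5%: {var_gaussian:.4f}")
print(f"Empirical VaR 99.5%:      {var_empirical:.4f}")  # materially larger

A strategy optimised to minimise the Gaussian-model figure can therefore load up on jump exposure while its reported VaR falls, which is the stylised regulation risk the note describes.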


2018 ◽  
Vol 46 ◽  
pp. 1860058
Author(s):  
Ye Chen

The latest results of searches for heavy Higgs bosons in fermionic final states are presented using the CMS detector at the LHC. Results are based on pp collision data collected at centre-of-mass energies of 8 and 13 TeV, interpreted according to different extensions of the Standard Model such as the MSSM, 2HDM and NMSSM. These searches look for evidence of other scalar or pseudoscalar bosons, in addition to the observed SM-like 125 GeV Higgs boson, and set 95% confidence level upper limits in the fermionic final states and benchmark models explored. The talk briefly reviews the major results obtained by the CMS Collaboration during Run I and presents the most recent searches performed during Run II.


2022 ◽  
Vol 10 (4) ◽  
pp. 508-517
Author(s):  
Umiyatun Muthohiroh ◽  
Rita Rahmawati ◽  
Dwi Ispriyanti

A portfolio is a combination of two or more securities held as investment targets for a certain period of time under certain conditions. The Markowitz method seeks to maximize expected return while minimizing stock risk. One method that can be used to measure risk is Expected Shortfall (ES), the expected loss given that losses exceed the Value-at-Risk (VaR). To simplify the calculation of optimal portfolios with the Markowitz method and risk analysis with ES, an application was built using the MATLAB GUI. The data used in this study consist of three Jakarta Islamic Index (JII) stocks: CPIN, CTRA and BSDE. Portfolio formation with the Markowitz method yields an optimal portfolio combining CPIN (34.7%) and BSDE (65.3%). At the 95% confidence level, the ES value of 0.206727 is greater than the VaR value (0.15512).
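As a hedged sketch of the pipeline the abstract describes (the original application uses the MATLAB GUI; here Python with synthetic data stands in for the CPIN, CTRA and BSDE series), the snippet below computes global minimum-variance Markowitz weights and then historical VaR and ES at the 95% level.

import numpy as np

def markowitz_min_variance(returns):
    # Global minimum-variance weights: w = S^-1 1 / (1' S^-1 1),
    # used here as a stand-in for the paper's Markowitz optimisation.
    cov = np.cov(returns, rowvar=False)
    w = np.linalg.inv(cov) @ np.ones(cov.shape[0])
    return w / w.sum()

def var_es(portfolio_returns, level=0.95):
    # Historical VaR and Expected Shortfall (mean loss beyond VaR).
    var = -np.quantile(portfolio_returns, 1.0 - level)
    es = -portfolio_returns[portfolio_returns <= -var].mean()
    return var, es

# Synthetic daily returns standing in for CPIN, CTRA and BSDE (illustrative only).
rng = np.random.default_rng(5)
returns = rng.multivariate_normal(
    mean=[0.0008, 0.0005, 0.0006],
    cov=[[4e-4, 1e-4, 1e-4], [1e-4, 6e-4, 2e-4], [1e-4, 2e-4, 5e-4]],
    size=1000)

w = markowitz_min_variance(returns)
var, es = var_es(returns @ w)
print("weights:", np.round(w, 3))
print(f"95% VaR: {var:.4f}, 95% ES: {es:.4f}")

By construction ES is at least as large as VaR at the same level, matching the ordering reported above (0.206727 vs. 0.15512).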

