Aggregation of Incidence and Intensity Risk Variables to Achieve Reconciliation

Risks ◽  
2019 ◽  
Vol 7 (4) ◽  
pp. 107
Author(s):  
Clive Hunt ◽  
Ross Taplin

The aggregation of individual risks into total risk using a weighting variable multiplied by two ratio variables representing incidence and intensity is an important task for risk professionals. For example, the expected loss (EL) of a loan is the product of its exposure at default (EAD), probability of default (PD), and loss given default (LGD). Simple weighted (by EAD) means of PD and LGD are intuitive summaries; however, they do not satisfy a reconciliation property whereby their product with the total EAD equals the sum of the individual expected losses. This makes their interpretation problematic, especially when trying to ascertain whether changes in EAD, PD, or LGD are responsible for a change in EL. We propose means for PD and LGD that have the property of reconciling at the aggregate level. Properties of the new means are explored, including how changes in EL can be attributed to changes in EAD, PD, and LGD. Other applications, such as insurance, where the incidence ratio is a utilization rate (UR) and the intensity ratio is an average benefit (AB), are discussed, and the generalization to products of more than two ratio variables is provided.
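The reconciliation problem the abstract describes can be shown numerically. The sketch below uses made-up loan figures; the reconciling LGD mean (weighted by EAD × PD rather than EAD alone) is one simple construction that restores the identity, and is an illustration rather than the exact means proposed by Hunt and Taplin.

```python
# Three hypothetical loans: exposure, default probability, loss severity.
ead = [100.0, 250.0, 50.0]   # exposure at default per loan
pd_ = [0.02, 0.05, 0.10]     # probability of default per loan
lgd = [0.40, 0.60, 0.25]     # loss given default per loan

el = sum(e * p * l for e, p, l in zip(ead, pd_, lgd))  # total expected loss
total_ead = sum(ead)

# Simple EAD-weighted means: intuitive, but their product does not reconcile.
w_pd = sum(e * p for e, p in zip(ead, pd_)) / total_ead
w_lgd = sum(e * l for e, l in zip(ead, lgd)) / total_ead
assert abs(total_ead * w_pd * w_lgd - el) > 0.1   # off by a visible margin

# One reconciling choice: weight the LGD mean by EAD * PD instead of EAD.
r_lgd = (sum(e * p * l for e, p, l in zip(ead, pd_, lgd))
         / sum(e * p for e, p in zip(ead, pd_)))
assert abs(total_ead * w_pd * r_lgd - el) < 1e-6  # reconciles exactly
```

With these figures the total EL is 9.55, while the product of total EAD and the two EAD-weighted means is about 9.87, so an analyst decomposing a change in EL across EAD, PD, and LGD would start from an inconsistent baseline.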

Author(s):  
Gleeson Simon

This chapter discusses the internal ratings-based approach (IRB). The IRB permits a bank to use its internal models to derive risk weights for particular exposures. There are two available bases for the IRB: foundation (F-IRB), which permits the bank to model probability of default (PD) but relies on regulatory standard figures to determine loss given default (LGD) and exposure at default (EAD); and advanced (A-IRB), in which all three of these are modelled. The A-IRB approach models PD, LGD, EAD, and maturity (M). Both IRB approaches model both expected loss (EL) and unexpected loss (UL), and IRB banks are expected to recognise the EL derived from their models in their capital calculations. Consequently, a bank using an IRB approach will generally have a different total capital level from that which it would have as an SA bank.


2017 ◽  
Vol 16 (3) ◽  
pp. 157-170 ◽  
Author(s):  
Gary Van Vuuren ◽  
Riaan De Jongh ◽  
Tanja Verster

The Basel regulatory credit risk rules for expected losses require banks to use downturn loss given default (LGD) estimates because the correlation between the probability of default (PD) and LGD is not captured in the rules, even though empirical research has repeatedly demonstrated this correlation. A model is examined which captures this correlation using empirically observed default frequencies together with simulated LGD and default data for a loan portfolio. The model is tested under various conditions dictated by its input parameters. Having established an estimate of the impact on expected losses, it is suggested that the model could be calibrated using banks' own loss data to compensate for the omission of the correlation dependence. Because the model relies on observed default frequencies, it could adapt in real time, allowing provisions to be allocated dynamically.
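The effect the abstract describes, that ignoring PD–LGD correlation understates expected loss, can be reproduced with a toy simulation. The single-factor setup below (a shared downturn factor driving both defaults and loss severities) is an illustrative assumption, not the calibrated model examined in the paper.

```python
# Toy single-factor model: a systematic factor z drives both default
# (low z = downturn = more defaults) and loss severity (low z = higher LGD).
import random

random.seed(42)
n = 100_000
defaults, lgds, losses = [], [], []
for _ in range(n):
    z = random.gauss(0, 1)                       # systematic downturn factor
    defaulted = random.gauss(0, 1) + z < -1.5    # downturn raises defaults
    lgd = min(1.0, max(0.0, 0.5 - 0.2 * z + random.gauss(0, 0.1)))
    defaults.append(defaulted)
    lgds.append(lgd)
    losses.append(lgd if defaulted else 0.0)

pd_hat = sum(defaults) / n      # unconditional default frequency
lgd_hat = sum(lgds) / n         # unconditional mean LGD
el_true = sum(losses) / n       # expected loss with the correlation intact
el_naive = pd_hat * lgd_hat     # expected loss assuming independence
assert el_true > el_naive       # independence understates the loss
```

Because defaults cluster in states where LGD is also elevated, the simulated expected loss is materially above the independence-based product, which is the gap that downturn LGD estimates are meant to paper over.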


2017 ◽  
Author(s):  
Mikael B Gustavsson ◽  
Jörgen Magnér ◽  
Bethanie Carney Almroth ◽  
Martin K Eriksson ◽  
Joachim Sturve ◽  
...  

Chemical pollution was monitored and assessed along the Swedish west coast. Of 172 analyzed organic chemicals, 62 were detected in the water phase of at least one of the five monitored sites. A Concentration Addition based screening-level risk assessment indicates that all sites are at risk from chemical contamination, with total risk quotients between 2 and 9. At only one site did none of the individual chemicals exceed its individual environmental threshold (PNEC, EQS). The monitoring data thus demonstrate a widespread blanket of diffuse pollution, with no clear trends amongst sites. Further issues critical for environmental chemical risk assessment include the challenge of achieving sufficiently low detection limits, especially for hormones and cypermethrin (a pyrethroid insecticide), the appropriate consideration of non-detects, and the limited availability of reliable PNEC and EQS values.
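The Concentration Addition screening step amounts to summing each chemical's risk quotient (measured concentration divided by its PNEC) and flagging the site when the sum exceeds 1. The concentrations and PNECs below are hypothetical placeholders, not the Swedish monitoring data.

```python
# Hypothetical site: chemical -> (measured concentration, PNEC), both ug/L.
site = {
    "chem_A": (0.05, 0.10),
    "chem_B": (0.09, 0.20),
    "chem_C": (0.10, 0.50),
}

# Per-chemical risk quotient: concentration / PNEC.
quotients = {name: c / pnec for name, (c, pnec) in site.items()}
total_rq = sum(quotients.values())   # Concentration Addition total

# No single chemical exceeds its own threshold here...
exceed_individually = [n for n, q in quotients.items() if q > 1]
assert exceed_individually == []
# ...yet the mixture as a whole is flagged.
assert total_rq > 1
```

This mirrors the paper's finding: a site can be at risk from the mixture (total quotient above 1) even where every individual chemical stays below its PNEC or EQS.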


2017 ◽  
Vol 8 (7) ◽  
pp. 816-826 ◽  
Author(s):  
Gilad Feldman ◽  
Huiwen Lian ◽  
Michal Kosinski ◽  
David Stillwell

There are two conflicting perspectives regarding the relationship between profanity and dishonesty. On the one hand, these two forms of norm-violating behavior share common causes and are often considered to be positively related. On the other hand, profanity is often used to express one’s genuine feelings and could therefore be negatively related to dishonesty. In three studies, we explored the relationship between profanity and honesty. We examined profanity and honesty first with profanity behavior and lying on a scale in the lab (Study 1; N = 276), then with a linguistic analysis of real-life social interactions on Facebook (Study 2; N = 73,789), and finally with profanity and integrity indexes at the aggregate level of U.S. states (Study 3; N = 50 states). We found a consistent positive relationship between profanity and honesty; profanity was associated with less lying and deception at the individual level and with higher integrity at the society level.


Author(s):  
Christopher McCarthy-Latimer

This article describes the results of using deliberation in a classroom where a majority of students are working. The course included college students from Framingham State University, who discussed the economic impact of a big-box store. The analysis includes a review of the deliberative polling literature; research on civic learning data; an examination of both net and gross changes in the participants' opinions; and, finally, a discussion of the implications for engagement. The results illustrate that the process of deliberation effects changes in attitude items at both the individual and the aggregate level.


1980 ◽  
Vol 17 (4) ◽  
pp. 516-523 ◽  
Author(s):  
William L. Moore

Two segmented methods of performing conjoint analysis, clustered and componential segmentation, are compared with each other as well as with individual-level and totally aggregate-level analyses. The two segmented methods provide insights into the data that (1) are not obtainable at the aggregate level and (2) are in a form that is more easily communicated than the information from the individual-level analysis. The predictive power of the clustered segmentation method is higher than that of componential segmentation, and both are superior to the aggregate analysis but inferior to the individual-level analysis.


2019 ◽  
Author(s):  
David Delgado-Vaquero ◽  
José Morales-Díaz ◽  
Constancio Zamora-Ramírez

2019 ◽  
pp. 004912411987596
Author(s):  
Tim Futing Liao

In common sociological research, income inequality is measured only at the aggregate level. The main purpose of this article is to demonstrate that there is more than meets the eye when inequality is indicated by a single measure. In this article, I introduce an alternative method that evaluates individuals’ contributions to inequality as well as the between-group and within-group components of these individual contributions. I first highlight three common inequality measures, the Gini index and two generalized entropy measures—Theil’s T and Theil’s L indices—by presenting their individual components as a method for evaluating inequality. Five artificial data examples first illustrate the use of these individual components. An empirical analysis of the 2007 and 2017 Current Population Survey data then focuses on the differences in inequality revealed by such individual inequality components between 2007 and 2017. The individual-level inequality measures can reveal patterns of inequality concealed by single measures at the aggregate level. In particular, the Gini individual measures differentiate cases better than the generalized entropy measures and tend to have smaller standard errors in a regression analysis.
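The idea of decomposing an aggregate inequality index into individual components can be sketched with the Gini index. The rank-based decomposition below is a standard textbook formulation and not necessarily the exact individual measure defined in Liao's article; the income figures are artificial.

```python
# Standard rank-based decomposition of the Gini index: with incomes
# sorted ascending, person at rank r (1-based) contributes
# (2r - n - 1) * x / (n**2 * mean), and the components sum to the Gini.
def gini_components(incomes):
    xs = sorted(incomes)
    n = len(xs)
    mu = sum(xs) / n
    return [(2 * (r + 1) - n - 1) * x / (n * n * mu)
            for r, x in enumerate(xs)]

incomes = [10, 20, 30, 40, 100]
parts = gini_components(incomes)   # one signed component per person
gini = sum(parts)                  # aggregate Gini index, here 0.4
```

Poorer individuals carry negative components and richer individuals positive ones, so the vector of components shows who drives the aggregate figure, which is the kind of information a single summary index conceals.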


Societies ◽  
2020 ◽  
Vol 10 (1) ◽  
pp. 20
Author(s):  
Sarah Harrison

Electoral psychology is defined as any model based on human psychology that is used to explain any electoral experience or outcome at the individual or aggregate level. Electoral psychology can also be an interface with other crucial aspects of the vote. For example, the interface between electoral psychology and electoral organization constitutes electoral ergonomics. The very nature of the models tested in electoral psychology has also led scholars in the field to complement mainstream social science methodologies with their own specific methodological approaches in order to capture the subconscious component of the vote and the subtle nature of the psychological processes determining the electoral experience and the way in which it permeates citizens’ thoughts and lives. After defining electoral psychology, this introductory article scopes its analytical roots and contemporary relevance, focuses on the importance of switching from “institution-centric” to “people-centric” conceptions of electoral behavior, and notably how it redefines key concepts such as electoral identity and consistency, and approaches questions of personality, morality, memory, identity, and emotions in electoral psychological models. Then, it discusses some of the unique methodological challenges that the field faces, notably when it comes to analyzing largely subconscious phenomena, and addresses them, before explaining how the various contributions to this Special Issue give a flavor of the scope and approaches of electoral psychology contributions to electoral studies.

