Using artificial censoring to improve extreme tail quantile estimates

2018 ◽  
Vol 67 (4) ◽  
pp. 791-812 ◽  
Author(s):  
Yang Liu ◽  
Matías Salibián‐Barrera ◽  
Ruben H. Zamar ◽  
James V. Zidek


2021 ◽  
pp. 135481662110300
Author(s):  
Usamah F Alfarhan ◽  
Khaldoon Nusair ◽  
Hamed Al-Azri ◽  
Saeed Al-Muharrami ◽  
Nan Hua

Tourism expenditures are determined by a set of antecedents that reflect tourists’ willingness and ability to spend, and de facto incremental monetary outlays at which willingness and ability are transformed into total expenditures. Based on the neoclassical argument of utility-constrained expenditure minimization, we extend the current literature by applying a sustainability-based segmentation criterion, namely the Legatum Prosperity Index™, to the decomposition of a total expenditure differential into tourists’ relative willingness to spend and an upper bound of third-degree price discrimination, using mean-level and conditional quantile estimates. Our results indicate that understanding the price–quantity composition of international inbound tourism expenditure differentials assists agents in the tourism industry in their quest for profit maximization.


2021 ◽  
Author(s):  
Ilaria Prosdocimi ◽  
Thomas Kjeldsen

The potential for changes in hydrometeorological extremes is routinely investigated by fitting change-permitting extreme value models to long-term observations, allowing one or more distribution parameters to change as a function of time or some physically motivated covariate. In most practical extreme value analyses, though, the main quantities of interest are the upper quantiles of the distribution rather than the parameter values. This study focuses on changes in quantile estimates under different change-permitting models. First, metrics that measure the impact of changes in parameters on changes in quantiles are introduced. The mathematical structure of these change metrics is investigated for several models based on the Generalised Extreme Value (GEV) distribution. It is shown that for the most commonly used models, the predicted changes in the quantiles are a non-intuitive function of the distribution parameters, leading to results that are difficult to interpret. Next, it is posited that commonly used change-permitting GEV models do not preserve a constant coefficient of variation, a property that is typically assumed to hold and that is related to the scaling properties of extremes. To address these shortcomings, a new parsimonious model is proposed: it assumes a constant coefficient of variation, allowing the location and scale parameters to change simultaneously. The proposed model yields more interpretable changes in the quantile function. The consequences of the different modelling choices for quantile estimates are illustrated using a dataset of extreme peak river flow measurements.
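The key property of the constant-coefficient-of-variation model can be checked numerically: if location and scale are rescaled by a common factor, every GEV quantile rescales by that same factor. A minimal sketch (not the authors' code) using scipy, which parametrises the GEV shape as c = -ξ:

```python
import numpy as np
from scipy.stats import genextreme

# GEV quantile (return level) for non-exceedance probability p.
# NOTE: scipy's shape parameter c equals -xi in the usual GEV notation.
def gev_quantile(p, loc, scale, xi):
    return genextreme.ppf(p, c=-xi, loc=loc, scale=scale)

p = 0.99            # roughly the 100-year event
xi = 0.1            # mildly heavy upper tail
q0 = gev_quantile(p, loc=100.0, scale=30.0, xi=xi)

# A change that preserves the coefficient of variation: location and
# scale change by the same factor, so the quantile q = loc + scale*g(p)
# changes by exactly that factor too.
factor = 1.2
q1 = gev_quantile(p, loc=100.0 * factor, scale=30.0 * factor, xi=xi)

print(q0, q1, q1 / q0)   # the ratio equals the common factor, 1.2
```

The interpretability claimed in the abstract follows from this proportionality: under the constant-CV model, a change in the parameters translates into the same relative change at every quantile level.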


2016 ◽  
Vol 20 (12) ◽  
pp. 4717-4729 ◽  
Author(s):  
Martin Durocher ◽  
Fateh Chebana ◽  
Taha B. M. J. Ouarda

Abstract. This study investigates the use of hydrological information in regional flood frequency analysis (RFFA) to enforce desired properties for a group of gauged stations. Neighbourhoods are particular types of regions that are centred on target locations. A challenge in using neighbourhoods in RFFA is that hydrological information is not available at target locations and cannot be completely replaced by the available physiographical information. Instead of using the available physiographic characteristics to define the centre of a target location, this study proposes introducing estimates of reference hydrological variables to ensure better homogeneity. These reference variables capture nonlinear relations with the site characteristics, obtained by projection pursuit regression, a nonparametric regression method. The resulting neighbourhoods are investigated in combination with commonly used regional models: the index-flood model and regression-based models. The complete approach is illustrated in a real-world case study with gauged sites from the southern part of the province of Québec, Canada, and is compared with traditional approaches such as the region of influence and canonical correlation analysis. The evaluation focuses on neighbourhood properties as well as prediction performance, with special attention devoted to problematic stations. Results show clear improvements in both neighbourhood definitions and quantile estimates.


1997 ◽  
Vol 33 (9) ◽  
pp. 2089-2096 ◽  
Author(s):  
T. A. Cohn ◽  
W. L. Lane ◽  
W. G. Baier

2019 ◽  
Vol 20 (1) ◽  
pp. 106-123 ◽  
Author(s):  
Mustafizur Rahman ◽  
Md. Al-Hasan

This article examines Bangladesh’s latest available Quarterly Labour Force Survey 2015–2016 data to draw in-depth insights into the gender wage gap and wage discrimination in the Bangladeshi labour market. The mean wage decomposition shows that, on average, a woman in Bangladesh earns a 12.2 per cent lower wage than a man, and that about half of the wage gap can be explained by labour market discrimination against women. Quantile counterfactual decomposition shows that women are subject to a higher wage penalty at the lower deciles of the wage distribution, with the wage gap varying between 8.3 per cent and 19.4 per cent across deciles. We find that at lower deciles, a significant part of the gender wage gap is attributable to the relatively larger presence of informal employment. Conditional quantile estimates further reveal that formally employed female workers earn higher wages than their male counterparts at the first decile but suffer a wage penalty at the top deciles. JEL: C21, J31, J46, J70
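The mean-level decomposition referred to above is the standard Blinder–Oaxaca decomposition: the raw gap splits exactly into an "explained" endowment component and an "unexplained" (discrimination) component. A minimal sketch on synthetic data (the covariate and coefficient values are illustrative, not from the survey):

```python
import numpy as np

rng = np.random.default_rng(0)

def ols(X, y):
    # OLS coefficients via least squares; X includes an intercept column.
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Synthetic log-wage data for two groups (purely illustrative).
n = 2000
Xm = np.column_stack([np.ones(n), rng.normal(10, 2, n)])   # e.g. years of schooling
Xf = np.column_stack([np.ones(n), rng.normal(9, 2, n)])
ym = Xm @ np.array([1.0, 0.08]) + rng.normal(0, 0.3, n)
yf = Xf @ np.array([0.9, 0.06]) + rng.normal(0, 0.3, n)

bm, bf = ols(Xm, ym), ols(Xf, yf)

gap = ym.mean() - yf.mean()
explained = (Xm.mean(0) - Xf.mean(0)) @ bm      # endowment differences, at group-m prices
unexplained = Xf.mean(0) @ (bm - bf)            # coefficient ("discrimination") component

# With an intercept in both regressions, the two parts sum exactly to the raw gap.
print(gap, explained, unexplained)
```

The quantile counterfactual decomposition in the article extends this identity to each decile of the wage distribution rather than the mean.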


2017 ◽  
Vol 17 (9) ◽  
pp. 1623-1629 ◽  
Author(s):  
Berry Boessenkool ◽  
Gerd Bürger ◽  
Maik Heistermann

Abstract. High precipitation quantiles tend to rise with temperature, following the so-called Clausius–Clapeyron (CC) scaling. It is often reported that the CC-scaling relation breaks down, and even reverses, at very high temperatures. In our study, we investigate this reversal using observational climate data from 142 stations across Germany. One suggested meteorological explanation for the breakdown is limited moisture supply. Here we argue that, instead, it could simply originate from undersampling. As rainfall frequency generally decreases at higher temperatures, rainfall intensities as dictated by CC scaling are less likely to be recorded than at moderate temperatures. Empirical quantiles are conventionally estimated from order statistics via various forms of plotting-position formulas. These have in common that their largest representable return period is given by the sample size; in small samples, high quantiles are accordingly underestimated. This small-sample effect is weaker, or disappears completely, when using parametric quantile estimates from a generalized Pareto distribution (GPD) fitted with L-moments. For those, we obtain quantiles of rainfall intensities that continue to rise with temperature.
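The undersampling argument can be reproduced in a few lines: empirical quantile estimates from small samples are bounded by the observed order statistics, while a parametric fit can extrapolate beyond the sample maximum. A hedged sketch (scipy's maximum-likelihood GPD fit stands in for the L-moment fit used in the paper):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
xi, n, p = 0.1, 30, 0.995          # shape, sample size, target quantile
true_q = genpareto.ppf(p, xi)      # true 0.995 quantile of the GPD

# Empirical quantiles from many small samples: the largest return
# period representable by order statistics is bounded by the sample
# size, so the 0.995 quantile is systematically underestimated at n = 30.
emp = np.array([np.quantile(genpareto.rvs(xi, size=n, random_state=rng), p)
                for _ in range(500)])

# A parametric fit can extrapolate beyond the sample maximum
# (maximum likelihood here; the paper fits the GPD by L-moments).
sample = genpareto.rvs(xi, size=n, random_state=rng)
c_hat, loc_hat, scale_hat = genpareto.fit(sample, floc=0)
par_q = genpareto.ppf(p, c_hat, loc_hat, scale_hat)

print(true_q, emp.mean(), par_q)   # the empirical mean sits well below true_q
```

This mirrors the paper's mechanism: with few wet records at the hottest temperature bins, plotting-position estimates of high quantiles drop even when the underlying intensity distribution keeps rising.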


Biometrika ◽  
2019 ◽  
Author(s):  
S Yang ◽  
K Pieper ◽  
F Cools

Summary. Structural failure time models are causal models for estimating the effect of time-varying treatments on a survival outcome. G-estimation and artificial censoring have been proposed for estimating the model parameters in the presence of time-dependent confounding and administrative censoring. However, most existing methods require manually pre-processing data into regularly spaced observations, which may invalidate the subsequent causal analysis. Moreover, computation and inference are challenging because of the nonsmoothness introduced by artificial censoring. We propose a class of continuous-time structural failure time models that respects the continuous-time nature of the underlying data processes. Under a martingale condition of no unmeasured confounding, we show that the model parameters are identifiable from a potentially infinite number of estimating equations. Using semiparametric efficiency theory, we derive the first semiparametric doubly robust estimators, which are consistent if either the model for the treatment process or the failure time model, but not necessarily both, is correctly specified. Moreover, we propose inverse probability of censoring weighting to handle dependent censoring. In contrast to artificial censoring, our weighting strategy does not introduce nonsmoothness into the estimation and ensures that resampling methods can be used for inference.


Biometrics ◽  
2011 ◽  
Vol 68 (1) ◽  
pp. 275-286 ◽  
Author(s):  
Marshall M. Joffe ◽  
Wei Peter Yang ◽  
Harold Feldman
