Probability of Fade: Lognormal Model

2009, pp. 63-63
Author(s): Réjean Plamondon, Wacef Guerfali, Xiaolin Li

We recently developed a general theory of rapid human movements and applied it to the generation of western handwriting. The goal of this paper is to summarize the key concepts behind the so-called vectorial delta-lognormal model that results from this theory and to show how this model could be used for Chinese character analysis and processing.
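
A minimal numerical sketch of the idea, not the authors' implementation: in the delta-lognormal law, the speed profile of a rapid stroke is the difference of two lognormal pulses, an agonist and an antagonist component sharing a common activation time t0. Every parameter value in the Python snippet below is an illustrative assumption.

    import numpy as np

    def lognormal_pulse(t, t0, mu, sigma):
        # Lognormal time profile; zero before the activation time t0.
        out = np.zeros_like(t)
        m = t > t0
        dt = t[m] - t0
        out[m] = np.exp(-(np.log(dt) - mu) ** 2 / (2.0 * sigma ** 2)) \
            / (dt * sigma * np.sqrt(2.0 * np.pi))
        return out

    def delta_lognormal_speed(t, D1, D2, t0, mu1, s1, mu2, s2):
        # Speed = agonist pulse minus antagonist pulse (the "delta" in the name).
        return D1 * lognormal_pulse(t, t0, mu1, s1) - D2 * lognormal_pulse(t, t0, mu2, s2)

    # Illustrative stroke: a dominant agonist command and a weaker, later antagonist.
    t = np.linspace(0.0, 1.0, 1000)
    v = delta_lognormal_speed(t, D1=10.0, D2=2.0, t0=0.05,
                              mu1=-1.6, s1=0.3, mu2=-1.2, s2=0.3)

The resulting profile is the familiar asymmetric bell shape of rapid-stroke velocity; the vectorial version of the model superimposes such strokes along different directions to trace a pen-tip trajectory.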


Author(s): Shewkar Ibrahim, Tarek Sayed

Enforcement agencies generally operate under strict budgets and with limited resources. For this reason, they continually search for new approaches that maximize the efficiency and effectiveness of their deployment. The Data-Driven Approaches to Crime and Traffic Safety (DDACTS) model attempts to identify opportunities where increased visibility of traffic enforcement can reduce collision frequencies as well as criminal incidents. Previous research modeled collisions and crime separately, despite evidence suggesting that the two events could be correlated, and little is known about the effect of automated enforcement programs on crime. This study developed a multivariate Poisson-lognormal model for the city of Edmonton to quantify the correlation between collisions and crime and to determine whether automated enforcement programs can also reduce crime within a neighborhood. The study found a high correlation of 0.72 between collisions and crime, which indicates that collision hotspots were also likely to be crime hotspots. The results also showed that increased enforcement presence reduced not only collisions but also crime. If a single deployment can achieve multiple objectives (e.g., reducing crime and collisions), then optimizing an agency's deployment strategy would decrease the demand on its resources and allow it to achieve more with less.
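
To make the model structure concrete, here is a minimal simulation sketch in Python, not the study's code: in a bivariate Poisson-lognormal model, each neighborhood's collision and crime counts are conditionally Poisson given lognormal site effects, and the correlation between those effects is what links the two outcomes. Apart from the 0.72 latent correlation reported above, every number is an illustrative assumption.

    import numpy as np

    rng = np.random.default_rng(42)
    n_sites = 500                 # hypothetical number of neighborhoods

    # Site effects: the exponential of a bivariate normal whose correlation
    # (0.72, as estimated in the study) ties the two outcomes together.
    var = 0.4                     # assumed variance of each log-scale effect
    rho = 0.72
    cov = var * np.array([[1.0, rho], [rho, 1.0]])
    eps = rng.multivariate_normal([0.0, 0.0], cov, size=n_sites)

    # Counts are conditionally independent Poisson given the site effects.
    collisions = rng.poisson(np.exp(np.log(5.0) + eps[:, 0]))  # assumed baseline mean 5
    crimes = rng.poisson(np.exp(np.log(8.0) + eps[:, 1]))      # assumed baseline mean 8

    print(np.corrcoef(collisions, crimes)[0, 1])  # positive, attenuated below 0.72

Fitting the model runs this logic in reverse: given the observed counts, it recovers the latent correlation that separate models for each outcome would ignore.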


Author(s): Jerome Lavoue, Igor Burstyn

Abstract

Objectives: Workplace exposure measurements typically contain some observations below the limit of detection. The current paradigm for exposure data interpretation relies on the lognormal distribution, where censored observations are assumed to be present but not quantifiable. However, there are settings where such assumptions are untenable and true zero exposures cannot be ruled out. This issue can be non-trivial because decisions about compliance depend on the adequacy of the lognormal model.

Methods: We adapted previously described statistical models for a mixture of true zeros and a lognormal distribution to function within a Bayesian procedure, overcoming the historical limitations that precluded these models from being used in practice. We compared the performance of the new models and the traditional lognormal model in simulation, and we illustrate their implementation on diverse datasets.

Results: The approach we propose estimates the proportion of true zeros together with the geometric mean and geometric standard deviation of the lognormal component of the mixture. In practice it can be implemented either with a truncated lognormal model fit to the observed data or with the censored Bernoulli-lognormal mixture model, which has the advantage of allowing multiple censoring points. Both models are available through a free online application. In simulations, when none of the censored values were zeros, all estimation procedures led to similar risk assessments. However, when all or most of the censored values were zeros, the traditional approach that assumes a lognormal distribution performed noticeably worse than the newly proposed methods, typically overestimating noncompliance. Application to real data suggests that we cannot rule out the presence of true zero exposures in typical measurement series gathered by occupational hygienists.

Conclusions: Forcing the usual lognormal model onto data containing a large proportion of censored values can bias risk assessment if a substantial part of the censored points are true zeros. The Bernoulli-lognormal mixture model is a suitable and accessible alternative that accounts for such challenging data and leads to unbiased risk assessments regardless of whether true zeros are present.
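
As a concrete illustration of the mixture, here is a maximum-likelihood sketch in Python, not the authors' Bayesian procedure or their online application: each censored point is either a true zero, with probability p, or a lognormal draw that fell below its limit of detection, while each detected point contributes the ordinary lognormal density. The simulated dataset and starting values are hypothetical.

    import numpy as np
    from scipy import stats, optimize

    def neg_log_lik(params, x, lod):
        # params: (logit_p, mu, log_sigma), where p = P(true zero) and
        # mu, sigma are the log-scale mean and sd of the lognormal component.
        logit_p, mu, log_sigma = params
        p = 1.0 / (1.0 + np.exp(-logit_p))
        sigma = np.exp(log_sigma)
        censored = x < lod
        # Censored points: a true zero, or a lognormal draw below the LOD.
        ll = np.sum(np.log(p + (1.0 - p) *
                           stats.norm.cdf((np.log(lod[censored]) - mu) / sigma)))
        # Detected points: non-zero component times the lognormal density.
        z = (np.log(x[~censored]) - mu) / sigma
        ll += np.sum(np.log(1.0 - p) + stats.norm.logpdf(z)
                     - np.log(sigma * x[~censored]))
        return -ll

    # Hypothetical data: 30% true zeros, GM = 1, GSD = e, a single LOD of 0.5.
    rng = np.random.default_rng(1)
    n = 500
    x = np.where(rng.random(n) < 0.3, 0.0, rng.lognormal(0.0, 1.0, n))
    lod = np.full(n, 0.5)

    fit = optimize.minimize(neg_log_lik, np.zeros(3), args=(x, lod),
                            method="Nelder-Mead")
    logit_p, mu, log_sigma = fit.x
    print("P(zero):", 1.0 / (1.0 + np.exp(-logit_p)),
          "GM:", np.exp(mu), "GSD:", np.exp(np.exp(log_sigma)))

Because each censored point carries its own detection limit in the likelihood, the same structure extends naturally to the multiple censoring points the abstract mentions.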


AIHAJ, 1991, Vol. 52 (11), pp. 493-502
Author(s): Martha A. Waters, Steve Selvin, Stephen M. Rappaport
