A New Approach to Probability in Engineering Design and Optimization

1984 ◽  
Vol 106 (1) ◽  
pp. 5-10 ◽  
Author(s):  
J. N. Siddall

The anomalous position of probability and statistics in both mathematics and engineering is discussed, showing that there is little consensus on concepts and methods. For application in engineering design, probability is defined as strictly subjective in nature. It is argued that classical statistical methods for generating probability density functions, in which parameters are estimated for assumed theoretical distributions, should be used with caution, and that confidence limits are not really meaningful in a design context. Preferred methods are described, and a new evolutionary technique for developing probability distributions of new random variables is proposed. Although Bayesian methods are commonly considered to be subjective, it is argued that, in the engineering sense, they are really not. A general formulation of the probabilistic optimization problem is described, including the role of subjective probability density functions.


1984 ◽  
Vol 1 (19) ◽  
pp. 8
Author(s):  
Yoshito Tsuchiya ◽  
Yoshiaki Kawata

The objective of this paper is to propose a new approach, based on the direction of the typhoon track, for determining the probability of occurrence of extremal tides due to storm surges, as well as their return period. The study also examines how the length of the tidal records and the tidal variation caused by extensive coastal reclamation affect the fit of the extremal data to probability density functions. The method is justified by application to an analysis of the probability of occurrence of storm surges in Osaka Bay.
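As a sketch of the return-period arithmetic this kind of extremal analysis rests on, the snippet below fits a Gumbel (extreme-value type I) distribution to a synthetic annual-maxima series and converts an exceedance probability into a return period. The data, the 3 m level, and the Gumbel choice are illustrative assumptions, not the authors' direction-based method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical annual-maximum surge heights (m); a stand-in for observed data.
annual_max = stats.gumbel_r.rvs(loc=1.8, scale=0.4, size=60, random_state=rng)

# Fit a Gumbel (extreme-value type I) distribution to the annual maxima.
loc, scale = stats.gumbel_r.fit(annual_max)

# Return period of a 3 m surge: T = 1 / P(exceedance in a given year).
level = 3.0
p_exceed = stats.gumbel_r.sf(level, loc, scale)
return_period = 1.0 / p_exceed
print(f"P(exceed {level} m in a year) = {p_exceed:.4f}")
print(f"return period = {return_period:.1f} years")
```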


2012 ◽  
Vol 15 (07) ◽  
pp. 1250047 ◽  
Author(s):  
CAROLE BERNARD ◽  
ZHENYU CUI ◽  
DON MCLEISH

This paper presents a new approach to perform a nearly unbiased simulation using inversion of the characteristic function. As an application we are able to give unbiased estimates of the price of forward starting options in the Heston model and of continuously monitored Parisian options in the Black-Scholes framework. This method of simulation can be applied to problems for which the characteristic functions are easily evaluated but the corresponding probability density functions are complicated.
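The inversion this approach relies on can be illustrated with a case whose answer is known: recovering the standard normal density from its characteristic function φ(t) = exp(−t²/2) via the identity f(x) = (1/π) ∫₀^∞ Re[e^{−itx} φ(t)] dt. This is only the basic inversion identity, not the paper's nearly unbiased estimator.

```python
import numpy as np
from scipy.integrate import quad

# Characteristic function of the standard normal (a case with a known answer).
phi = lambda t: np.exp(-0.5 * t**2)

def pdf_via_inversion(x):
    """Recover f(x) = (1/pi) * integral_0^inf Re[e^{-itx} phi(t)] dt."""
    integrand = lambda t: np.real(np.exp(-1j * t * x) * phi(t))
    val, _ = quad(integrand, 0, np.inf)
    return val / np.pi

exact = np.exp(-0.5 * 0.7**2) / np.sqrt(2 * np.pi)
approx = pdf_via_inversion(0.7)
print(approx, exact)  # the two agree to high precision
```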


1993 ◽  
Vol 115 (3) ◽  
pp. 385-391 ◽  
Author(s):  
R. J. Eggert ◽  
R. W. Mayne

Probabilistic optimization using the moment matching method and the simulation optimization method is discussed and compared to conventional deterministic optimization. A new approach based on successively approximating probability density functions, using recursive quadratic programming for the optimization process, is described. This approach incorporates the speed and robustness of analytical probability density functions and improves accuracy by considering simulation results. Theoretical considerations and an example problem illustrate the features of the approach. The paper closes with a discussion of an objective function formulation which includes the expected cost of design constraint failure.
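A minimal illustration of the moment matching idea contrasted here with simulation: propagate the mean and variance of a response g through a linearization at the design point, then compare with a Monte Carlo estimate. The response function, distributions, and numbers are hypothetical, not from the paper.

```python
import numpy as np

# Hypothetical response function of two independent normal design variables.
g = lambda x1, x2: x1**2 + 3.0 * x2

mu = np.array([2.0, 1.0])
sigma = np.array([0.1, 0.2])

# First-order moment matching: propagate mean and variance through
# a linearization of g at the mean design point.
dg = np.array([2 * mu[0], 3.0])           # analytic partial derivatives at mu
mu_mm = g(*mu)                            # approximate E[g]
sd_mm = np.sqrt(np.sum((dg * sigma)**2))  # approximate std[g]

# Monte Carlo ("simulation") estimate for comparison.
rng = np.random.default_rng(1)
samples = g(rng.normal(mu[0], sigma[0], 200_000),
            rng.normal(mu[1], sigma[1], 200_000))
print(mu_mm, sd_mm)                    # moment-matching estimates
print(samples.mean(), samples.std())   # simulation estimates
```

The two pairs of numbers agree closely here because g is nearly linear over the spread of the inputs; a strongly nonlinear g is exactly where the paper's hybrid of analytical PDFs and simulation becomes useful.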


Author(s):  
Pedro Zuidberg Dos Martires ◽  
Anton Dries ◽  
Luc De Raedt

Weighted model counting has recently been extended to weighted model integration, which can be used to solve hybrid probabilistic reasoning problems. Such problems involve both discrete and continuous probability distributions. We show how standard knowledge compilation techniques (to SDDs and d-DNNFs) apply to weighted model integration, and use it in two novel solvers, one exact and one approximate solver. Furthermore, we extend the class of employable weight functions to actual probability density functions instead of mere polynomial weight functions.
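A toy instance of weighted model integration, assuming a deliberately simple hybrid problem with one Boolean and one continuous variable: enumerate the Boolean models and integrate the density over the continuous region each model induces. Real WMI solvers compile the formula first (to SDDs or d-DNNFs, as above); this sketch only shows the semantics.

```python
from scipy.integrate import quad

# Toy hybrid problem: Boolean b with P(b) = 0.3, continuous x with
# density w(x) = 2x on [0, 1].
w = lambda x: 2.0 * x
p_b = 0.3

# Formula: (b and x > 0.5) or (not b and x < 0.2).
# WMI enumerates the Boolean models and integrates the weight function
# over the continuous region each model induces.
mass_b, _ = quad(w, 0.5, 1.0)   # region for the b-model: x > 0.5
mass_nb, _ = quad(w, 0.0, 0.2)  # region for the not-b model: x < 0.2
prob = p_b * mass_b + (1 - p_b) * mass_nb
print(prob)
```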


2019 ◽  
Author(s):  
Guillaume Etter ◽  
Frederic Manseau ◽  
Sylvain Williams

Abstract: Understanding the role of neuronal activity in cognition and behavior is a key question in neuroscience. Previously, in vivo studies have typically inferred behavior from electrophysiological data using probabilistic approaches including Bayesian decoding. While providing useful information on the role of neuronal subcircuits, electrophysiological approaches are often limited in the maximum number of recorded neurons as well as their ability to reliably identify neurons over time. This can be particularly problematic when trying to decode behaviors that rely on large neuronal assemblies or on temporal mechanisms, such as a learning task over the course of several days. Calcium imaging of genetically encoded calcium indicators has overcome these two issues. Unfortunately, because calcium transients only indirectly reflect spiking activity and calcium imaging is often performed at lower sampling frequencies, this approach suffers from uncertainty in exact spike timing and thus activity frequency, making rate-based decoding approaches used in electrophysiological recordings difficult to apply to calcium imaging data. Here we describe a probabilistic framework that can be used to robustly infer behavior from calcium imaging recordings and relies on a simplified implementation of a naive Bayesian classifier. Our method discriminates between periods of activity and periods of inactivity to compute probability density functions (likelihood and posterior), significance and confidence intervals, as well as mutual information. We next devise a simple method to decode behavior using these probability density functions and propose metrics to quantify decoding accuracy. Finally, we show that neuronal activity can be predicted from behavior, and that the accuracy of such reconstructions can guide the understanding of relationships that may exist between behavioral states and neuronal activity.


2020 ◽  
Vol 14 ◽  
Author(s):  
Guillaume Etter ◽  
Frederic Manseau ◽  
Sylvain Williams

Understanding the role of neuronal activity in cognition and behavior is a key question in neuroscience. Previously, in vivo studies have typically inferred behavior from electrophysiological data using probabilistic approaches including Bayesian decoding. While providing useful information on the role of neuronal subcircuits, electrophysiological approaches are often limited in the maximum number of recorded neurons as well as their ability to reliably identify neurons over time. This can be particularly problematic when trying to decode behaviors that rely on large neuronal assemblies or on temporal mechanisms, such as a learning task over the course of several days. Calcium imaging of genetically encoded calcium indicators has overcome these two issues. Unfortunately, because calcium transients only indirectly reflect spiking activity and calcium imaging is often performed at lower sampling frequencies, this approach suffers from uncertainty in exact spike timing and thus activity frequency, making rate-based decoding approaches used in electrophysiological recordings difficult to apply to calcium imaging data. Here we describe a probabilistic framework that can be used to robustly infer behavior from calcium imaging recordings and relies on a simplified implementation of a naive Bayesian classifier. Our method discriminates between periods of activity and periods of inactivity to compute probability density functions (likelihood and posterior), significance and confidence intervals, as well as mutual information. We next devise a simple method to decode behavior using these probability density functions and propose metrics to quantify decoding accuracy. Finally, we show that neuronal activity can be predicted from behavior, and that the accuracy of such reconstructions can guide the understanding of relationships that may exist between behavioral states and neuronal activity.
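A minimal sketch of the kind of activity/inactivity naive Bayes decoder described above, run on synthetic binarized data; the neuron counts, tuning probabilities, and two-state behavior are invented for illustration and the real framework computes more than the posterior shown here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: 20 neurons, 5000 frames, 2 behavioral states.
n_neurons, n_frames = 20, 5000
state = rng.integers(0, 2, n_frames)                   # ground-truth state per frame
p_active = np.where(state[None, :] == 1, 0.30, 0.05)   # neurons fire more in state 1
active = rng.random((n_neurons, n_frames)) < p_active  # binarized activity

# Training: per-neuron likelihoods P(active | state) and prior P(state).
prior = np.array([(state == s).mean() for s in (0, 1)])
like = np.array([[active[n, state == s].mean() for s in (0, 1)]
                 for n in range(n_neurons)])           # shape (neurons, states)
like = np.clip(like, 1e-3, 1 - 1e-3)                   # guard against log(0)

def decode(frame):
    """Naive-Bayes posterior over states for one binary activity vector."""
    logpost = np.log(prior).copy()
    for s in (0, 1):
        logpost[s] += np.sum(np.where(frame, np.log(like[:, s]),
                                      np.log(1 - like[:, s])))
    post = np.exp(logpost - logpost.max())
    return post / post.sum()

decoded = np.array([decode(active[:, t]).argmax() for t in range(n_frames)])
print("decoding accuracy:", (decoded == state).mean())
```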


2014 ◽  
Vol 535 ◽  
pp. 145-148
Author(s):  
Jeeng Min Ling ◽  
Kunkerati Lublertlop

In this paper, the Weibull, Gamma, Lognormal, and Rayleigh probability density functions (PDFs) were used to statistically analyze the characteristics of wind speed and to evaluate the wind energy, based on hourly records from 2004 to 2009 at 24 locations in Taiwan. The Weibull model showed the best goodness of fit for describing the wind-speed behavior over the six years at 7 weather-station sites, outperforming the Gamma and Rayleigh models. The annual mean wind power density was estimated and compared using different indices, and the feasibility of the probability distributions at the different locations was investigated.
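The distribution comparison can be sketched as below: fit candidate PDFs to hourly speeds, compare them by log-likelihood, and compute the mean wind power density P = ½ρE[v³]. The speeds are simulated stand-ins, not the Taiwan records, and log-likelihood is only one of several goodness-of-fit indices the study could use.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical hourly wind speeds (m/s); a stand-in for the station records.
speeds = stats.weibull_min.rvs(c=2.0, scale=7.0, size=8760, random_state=rng)

# Fit Weibull, Gamma and Rayleigh PDFs and compare by log-likelihood.
fits = {
    "Weibull":  (stats.weibull_min, stats.weibull_min.fit(speeds, floc=0)),
    "Gamma":    (stats.gamma,       stats.gamma.fit(speeds, floc=0)),
    "Rayleigh": (stats.rayleigh,    stats.rayleigh.fit(speeds, floc=0)),
}
for name, (dist, params) in fits.items():
    ll = np.sum(dist.logpdf(speeds, *params))
    print(f"{name:8s} log-likelihood = {ll:.1f}")

# Mean wind power density (W/m^2): P = 0.5 * rho * E[v^3].
rho = 1.225  # air density, kg/m^3
power_density = 0.5 * rho * np.mean(speeds**3)
print(f"mean wind power density = {power_density:.0f} W/m^2")
```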


2012 ◽  
Vol 3 (3) ◽  
pp. 13-31 ◽  
Author(s):  
George Diemer

Recent research has examined a statistical phenomenon sometimes associated with point shaving in the NCAA basketball gambling market. In a similar fashion, this study examines the NFL gambling market based on 3,641 games over the years 1993-2007. This research uses a new approach: a bootstrap hypothesis test of equality in the game outcome distributions for large favorites vs. small favorites. Probability density functions of large favorites vs. small favorites are constructed and compared. The results are consistent with studies that suggest point shaving exists, and they counter recent claims that the statistical phenomenon is of a more innocent nature. This research leads to the rejection of the null hypothesis of no point shaving in the NFL point spread gambling market.
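A bootstrap test of distributional equality of this kind can be sketched as follows, using synthetic stand-ins for the two groups and a pooled-resampling null. The test statistic here is a two-sample Kolmogorov-Smirnov distance, chosen for illustration; the paper's construction may differ.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical outcomes (margin minus spread) for the two groups; stand-ins
# for the paper's small-favorite and large-favorite samples.
small_fav = rng.normal(0.0, 13.0, 1500)
large_fav = rng.normal(-1.0, 13.0, 600)  # slight shift, as shaving would predict

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic (maximum CDF gap)."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.abs(cdf_a - cdf_b).max()

observed = ks_stat(small_fav, large_fav)

# Bootstrap under the null of equal distributions: resample both groups
# from the pooled data and recompute the statistic.
pooled = np.concatenate([small_fav, large_fav])
null = np.empty(2000)
for i in range(2000):
    resample = rng.choice(pooled, size=len(pooled), replace=True)
    null[i] = ks_stat(resample[:len(small_fav)], resample[len(small_fav):])

p_value = (null >= observed).mean()
print(f"observed KS = {observed:.3f}, bootstrap p = {p_value:.3f}")
```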

