Bayesian Rapid Optimal Adaptive Design (BROAD): Method and application distinguishing models of risky choice

2019 ◽  
Author(s):  
Debajyoti Ray ◽  
Daniel Golovin ◽  
Andreas Krause ◽  
Colin Camerer

Economic surveys and experiments usually present fixed questions to respondents. Rapid computation now allows adaptively optimized questions, based on previous responses, to maximize expected information. We describe a novel method of this type introduced in computer science and apply it experimentally to six theories of risky choice. The EC2 method creates equivalence classes, each consisting of a true theory and its noisy-response perturbations, and chooses questions with the goal of distinguishing between equivalence classes by cutting the edges connecting them. The edge-cutting information measure is adaptively submodular, which enables a provable performance bound and “lazy” evaluation that saves computation. The experimental data show that most subjects, making only 30 choices, can be reliably classified as choosing according to expected value (EV) maximization or one of two variants of prospect theory. We also show that it is difficult for subjects to manipulate the method by misreporting preferences, and we find no evidence of manipulation.
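As a rough illustration of the edge-cutting idea (not the authors' implementation), the sketch below scores each candidate question by the total prior-probability weight of the between-class edges it would cut, under a deterministic-response simplification. All hypothesis names, priors, and questions are hypothetical.

```python
from itertools import combinations

def edge_cut_weight(prior, classes, predictions, question):
    """Weight of between-class edges cut by asking `question`.

    An edge links every pair of hypotheses in different equivalence
    classes, weighted by the product of their prior probabilities; the
    question cuts the edge when the two hypotheses predict different
    answers (deterministic-response simplification of EC2).
    """
    cut = 0.0
    for h1, h2 in combinations(prior, 2):
        if (classes[h1] != classes[h2]
                and predictions[(h1, question)] != predictions[(h2, question)]):
            cut += prior[h1] * prior[h2]
    return cut

def greedy_question(prior, classes, predictions, questions):
    """Greedy step: pick the question that cuts the most edge weight."""
    return max(questions,
               key=lambda q: edge_cut_weight(prior, classes, predictions, q))
```

Because the edge-cutting objective is adaptively submodular, a greedy policy of this kind carries the performance guarantee mentioned above, and lazy evaluation can skip re-scoring questions whose cached scores are already dominated.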

2018 ◽  
Vol 141 (5) ◽  
Author(s):  
Yeshaswini Emmi ◽  
Andreas Fiolitakis ◽  
Manfred Aigner ◽  
Franklin Genin ◽  
Khawar Syed

A new model is presented in this work for including convective wall heat losses in the direct quadrature method of moments (DQMoM) approach, which is used here to solve the transport equation of the one-point, one-time joint thermochemical probability density function (PDF). This is of particular interest in the context of designing industrial combustors, where wall heat losses play a crucial role. In the present work, the novel method is derived and validated against experimental data for the thermal entrance region of a pipe. The impact of varying model-specific boundary conditions is analyzed. The method is then used to simulate the turbulent reacting flow of a confined methane jet flame. The simulations are carried out using the DLR in-house computational fluid dynamics code THETA. It is found that the DQMoM approach presented here agrees well with the experimental data and supports the use of the new convective wall heat loss model.
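For context on the validation case (not part of the authors' model): the mean Nusselt number in the laminar thermal entrance region of a pipe at constant wall temperature is classically approximated by the Hausen correlation, sketched here.

```python
def hausen_mean_nusselt(re, pr, d_over_l):
    """Hausen correlation: mean Nusselt number for laminar flow in the
    thermal entrance region of a pipe at constant wall temperature.

    re: Reynolds number, pr: Prandtl number, d_over_l: diameter/length.
    """
    gz = re * pr * d_over_l  # Graetz number
    return 3.66 + 0.0668 * gz / (1.0 + 0.04 * gz ** (2.0 / 3.0))
```

In the limit D/L → 0 the correlation recovers the fully developed constant-wall-temperature value Nu = 3.66.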


Author(s):  
Paul Erickson ◽  
David Grupp

A novel method of using a liquid phase oxidizer in fuel cell applications has been discovered by researchers at UC Davis. This paper outlines potential implications for improving heat transfer and catalytic activity with this method. Experimental data have been collected and the results show that the proposed method of using liquid phase oxidizer does indeed allow operation of PEM fuel cell systems. Data indicate an improvement in overvoltage at low current but also clearly indicate a severely limited concentration polarization region with non-regenerated fluid. The preliminary data indicate the physical feasibility of the method but also show that more research and development is required.
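The reported behavior (improved overvoltage at low current, but an early concentration-polarization limit with non-regenerated fluid) can be pictured with a standard semi-empirical polarization-curve model. All parameter values below are illustrative, not measurements from this study.

```python
import math

def cell_voltage(j, e0=1.0, a=0.06, j0=1e-4, r_ohm=0.2, b=0.05, j_lim=1.0):
    """Semi-empirical PEM polarization curve (illustrative parameters):
    open-circuit voltage minus activation (Tafel), ohmic, and
    concentration-polarization losses. j is current density in A/cm^2.
    """
    if not 0 < j < j_lim:
        raise ValueError("current density must lie in (0, j_lim)")
    activation = a * math.log(j / j0)              # Tafel kinetics
    ohmic = r_ohm * j                              # membrane/contact resistance
    concentration = b * math.log(j_lim / (j_lim - j))  # mass-transport limit
    return e0 - activation - ohmic - concentration
```

Lowering the limiting current density `j_lim`, as a depleted (non-regenerated) liquid oxidizer would, drops the cell voltage sharply well before the original mass-transport limit.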


2015 ◽  
Vol 3 (1-2) ◽  
pp. 52-87 ◽  
Author(s):  
Nori Jacoby ◽  
Naftali Tishby ◽  
Bruno H. Repp ◽  
Merav Ahissar ◽  
Peter E. Keller

Linear models have been used in several contexts to study the mechanisms that underpin sensorimotor synchronization. Given that their parameters are often linked to psychological processes such as phase correction and period correction, the fit of the parameters to experimental data is an important practical question. We present a unified method for parameter estimation of linear sensorimotor synchronization models that extends available techniques and enhances their usability. This method enables reliable and efficient analysis of experimental data for single subject and multi-person synchronization. In a previous paper (Jacoby et al., 2015), we showed how to significantly reduce the estimation error and eliminate the bias of parameter estimation methods by adding a simple and empirically justified constraint on the parameter space. By applying this constraint in conjunction with the tools of matrix algebra, we here develop a novel method for estimating the parameters of most linear models described in the literature. Through extensive simulations, we demonstrate that our method reliably and efficiently recovers the parameters of two influential linear models: Vorberg and Wing (1996), and Schulze et al. (2005), together with their multi-person generalization to ensemble synchronization. We discuss how our method can be applied to the study of individual differences in sensorimotor synchronization ability, for example in clinical populations and ensemble musicians.
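As a toy version of the estimation problem (a simplification of the linear models treated in the paper, with timekeeper noise only and no motor noise), one can simulate a first-order phase-correction process and recover its correction gain by least squares. Function names and parameter values are illustrative.

```python
import random

def simulate_asynchronies(alpha, n=2000, timekeeper_sd=10.0, seed=1):
    """Simplified linear phase-correction model (no motor noise):
    A[k+1] = (1 - alpha) * A[k] + T[k], with T[k] ~ N(0, timekeeper_sd)."""
    rng = random.Random(seed)
    a = [0.0]
    for _ in range(n):
        a.append((1 - alpha) * a[-1] + rng.gauss(0.0, timekeeper_sd))
    return a

def estimate_alpha(a):
    """Least-squares slope of A[k+1] on A[k]; alpha = 1 - slope."""
    x, y = a[:-1], a[1:]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return 1.0 - sxy / sxx
```

With motor noise included, this naive regression becomes biased, which is precisely the kind of problem the constrained estimation method described above is designed to address.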


1988 ◽  
Vol 82 (3) ◽  
pp. 719-736 ◽  
Author(s):  
George A. Quattrone ◽  
Amos Tversky

We contrast the rational theory of choice in the form of expected utility theory with descriptive psychological analysis in the form of prospect theory, using problems involving the choice between political candidates and public referendum issues. The results showed that the assumptions underlying the classical theory of risky choice are systematically violated in the manner predicted by prospect theory. In particular, our respondents exhibited risk aversion in the domain of gains, risk seeking in the domain of losses, and a greater sensitivity to losses than to gains. This is consistent with the advantage of the incumbent under normal conditions and the potential advantage of the challenger in bad times. The results further show how a shift in the reference point could lead to reversals of preferences in the evaluation of political and economic options, contrary to the assumption of invariance. Finally, we contrast the normative and descriptive analyses of uncertainty in choice and address the rationality of voting.
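The three regularities reported here (risk aversion for gains, risk seeking for losses, greater sensitivity to losses) are all captured by the prospect-theory value function. A minimal sketch, using the Tversky-Kahneman (1992) parameter estimates:

```python
def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains (risk aversion),
    convex for losses (risk seeking), and steeper for losses than for
    gains (loss aversion, lam > 1)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)
```

Concavity over gains makes a sure gain preferred to a fair gamble, convexity over losses does the reverse, and λ > 1 makes losses loom larger than equal-sized gains.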


2019 ◽  
Author(s):  
Sangil Lee ◽  
Chris M. Glaze ◽  
Eric T. Bradlow ◽  
Joe Kable

In intertemporal and risky choice decisions, parametric utility models are widely used for predicting choice and measuring individuals’ impulsivity and risk aversion. However, parametric utility models cannot describe data deviating from their assumed functional form. We propose a novel method using Cubic Bezier Splines (CBS) to flexibly model smooth and monotonic utility functions that can be fit to any dataset. CBS shows higher descriptive and predictive accuracy over extant parametric models and can identify common yet novel patterns of behavior previously unaccounted for. Furthermore, CBS provides measures of impulsivity and risk aversion that do not depend on parametric model assumptions.
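A minimal sketch of the idea (a single cubic Bezier segment rather than the authors' full spline; the control points are hypothetical): if the control points are nondecreasing in both coordinates, the curve is monotone and can be read as a utility function u(x) by inverting its x-coordinate.

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """De Casteljau evaluation of a cubic Bezier curve at parameter t."""
    def lerp(a, b, s):
        return tuple(ai + s * (bi - ai) for ai, bi in zip(a, b))
    q0, q1, q2 = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    r0, r1 = lerp(q0, q1, t), lerp(q1, q2, t)
    return lerp(r0, r1, t)

def bezier_utility(x, ctrl, tol=1e-9):
    """Utility u(x) from one cubic Bezier segment whose control points
    are nondecreasing in both coordinates (hence a monotone curve).
    Solves x(t) = x by bisection, then returns y(t)."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cubic_bezier(*ctrl, mid)[0] < x:
            lo = mid
        else:
            hi = mid
    return cubic_bezier(*ctrl, lo)[1]
```

A full CBS model would chain several such segments and fit the control points to choice data, which is what gives the approach its flexibility relative to fixed parametric forms.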


2021 ◽  
Author(s):  
Emmanuelle Blanc ◽  
Jérôme Enjalbert ◽  
Pierre Barbillon

Background and Aims: Functional-structural plant models are increasingly used by plant scientists to address a wide variety of questions. However, the calibration of these complex models is often challenging, mainly because of their high computational cost. In this paper, we apply an automatic method to the calibration of WALTer, a functional-structural wheat model that simulates the plasticity of tillering in response to competition for light.
Methods: We used a Bayesian calibration method to estimate the values of 5 parameters of the WALTer model by fitting the model outputs to tillering dynamics data. The method presented in this paper is based on the Efficient Global Optimisation algorithm. It involves the use of Gaussian process metamodels to generate fast approximations of the model outputs. To account for the uncertainty associated with the metamodels' approximations, an adaptive design was used. The efficacy of the method was first assessed using simulated data. The calibration was then applied to experimental data.
Key Results: The method presented here performed well on both simulated and experimental data. In particular, the use of an adaptive design proved to be a very efficient way to improve the quality of the metamodels' predictions, especially by reducing the uncertainty in areas of the parameter space that were of interest for the fitting. Moreover, we showed that a diversity of field data is necessary to calibrate the parameters.
Conclusions: The method presented in this paper, based on an adaptive design and Gaussian process metamodels, is an efficient approach for the calibration of WALTer and could be of interest for the calibration of other functional-structural plant models.
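A minimal sketch of the two main ingredients, a Gaussian process metamodel and an acquisition function (expected improvement, as in Efficient Global Optimisation). The kernel, length-scale, and toy objective are illustrative, not the WALTer setup.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf(a, b, length=0.3):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-8):
    """Posterior mean and std of a zero-mean GP metamodel at x_test."""
    k = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k_star = rbf(x_test, x_train)
    mean = k_star @ np.linalg.solve(k, y_train)
    v = np.linalg.solve(k, k_star.T)
    var = 1.0 - np.sum(k_star * v.T, axis=1)
    return mean, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mean, sd, best):
    """EI for minimisation: expected amount by which each candidate
    beats the best (lowest) objective value observed so far."""
    z = (best - mean) / sd
    cdf = np.array([0.5 * (1.0 + erf(zi / sqrt(2.0))) for zi in z])
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return (best - mean) * cdf + sd * pdf

# One adaptive-design step on a toy objective f(x) = (x - 0.6)**2:
x_train = np.array([0.0, 0.3, 1.0])
y_train = (x_train - 0.6) ** 2
x_cand = np.linspace(0.0, 1.0, 101)
mu, sigma = gp_posterior(x_train, y_train, x_cand)
x_next = x_cand[np.argmax(expected_improvement(mu, sigma, y_train.min()))]
```

An adaptive design then evaluates the expensive model at `x_next`, refits the metamodel, and repeats, concentrating runs in the regions of parameter space that matter for the fit.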


2021 ◽  
Author(s):  
Lisheng He ◽  
Pantelis P. Analytis ◽  
Sudeep Bhatia

A wide body of empirical research has revealed the descriptive shortcomings of expected value and expected utility models of risky decision making. In response, numerous models have been advanced to predict and explain people’s choices between gambles. Although some of these models have had a great impact in the behavioral, social, and management sciences, there is little consensus about which model offers the best account of choice behavior. In this paper, we conduct a large-scale comparison of 58 prominent models of risky choice, using 19 existing behavioral datasets involving more than 800 participants. This allows us to comprehensively evaluate models in terms of individual-level predictive performance across a range of different choice settings. We also identify the psychological mechanisms that lead to superior predictive performance and the properties of choice stimuli that favor certain types of models over others. Moreover, drawing on research on the wisdom of crowds, we argue that each of the existing models can be seen as an expert that provides unique forecasts in choice predictions. Consistent with this claim, we find that crowds of risky choice models perform better than individual models and thus provide a performance bound for assessing the historical accumulation of knowledge in our field. Our results suggest that each model captures unique aspects of the decision process and that existing risky choice models offer complementary rather than competing accounts of behavior. We discuss the implications of our results for theories of risky decision making and the quantitative modeling of choice behavior.
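The crowd construction can be sketched in a few lines: average the choice probabilities forecast by several models and compare log-loss. The toy probabilities below are hypothetical; by Jensen's inequality, the averaged forecast can never score worse than the models' average log-loss.

```python
import math

def log_loss(p, y):
    """Mean negative log-likelihood of binary choices y under probabilities p."""
    eps = 1e-12
    return -sum(yi * math.log(max(pi, eps)) + (1 - yi) * math.log(max(1 - pi, eps))
                for pi, yi in zip(p, y)) / len(y)

def crowd_forecast(model_probs):
    """Unweighted average of each model's predicted choice probabilities."""
    n = len(model_probs[0])
    return [sum(m[i] for m in model_probs) / len(model_probs) for i in range(n)]
```

Weighted or selected crowds are natural refinements, but even this unweighted average illustrates why model crowds can bound the performance of any single model.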


Cliometrica ◽  
2021 ◽  
Author(s):  
Toke S. Aidt ◽  
Stanley L. Winer ◽  
Peng Zhang

The Redistribution Hypothesis predicts that franchise extension causes an increase in state-sponsored redistribution. We test this hypothesis by considering the relationship between franchise extension and selected aspects of fiscal structure at both central and local government levels in the UK from 1820 to 1913. We do so without imposing a priori restrictions on the direction of causality, using a novel method for causal investigation of non-experimental data proposed by Hoover (2001). This method is based on tests for structural breaks in the conditional and marginal distributions of the franchise and fiscal structure time series, preceded by a detailed historical narrative analysis. We do not find compelling evidence supporting the Redistribution Hypothesis.
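Hoover's approach combines a historical narrative with statistical tests for structural breaks. As a generic illustration of the statistical ingredient (not Hoover's exact procedure), a Chow test compares a pooled regression against separate regressions before and after a candidate break date.

```python
def chow_statistic(y, x, break_idx):
    """Chow F-statistic for a structural break in the simple regression
    of y on x (with intercept) at position break_idx."""
    def rss(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((xi - mx) ** 2 for xi in xs)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(xs, ys))
        b = sxy / sxx
        a = my - b * mx
        return sum((yi - a - b * xi) ** 2 for xi, yi in zip(xs, ys))

    k = 2  # intercept and slope per regime
    rss_pooled = rss(x, y)
    rss_1 = rss(x[:break_idx], y[:break_idx])
    rss_2 = rss(x[break_idx:], y[break_idx:])
    n = len(y)
    return ((rss_pooled - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (n - 2 * k))
```

A large statistic relative to the F distribution's critical value signals a break; Hoover's method then asks whether breaks appear in the conditional or the marginal distribution to infer the direction of causality.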


2020 ◽  
Vol 2020 ◽  
pp. 1-16 ◽
Author(s):  
Fan Jia ◽  
Xingyuan Wang

Multicriteria group decision-making (MCGDM) problems have been a research hotspot in recent years, and prospect theory has been introduced to cope with the risk and imprecision in the process of decision-making. To guarantee the effectiveness of information aggregation and extend the applicability of prospect theory, this paper proposes a novel decision-making approach based on rough numbers and prospect theory to solve risky and uncertain MCGDM problems. First, by combining rough numbers and the best-worst method (BWM), we construct a linear programming model to calculate rough criteria weights, which are defined by lower and upper limits. Then, to address the imprecision of the value function and weighting function in prospect theory, we propose a novel method combining rough numbers and prospect theory to handle risk in decision-making problems. Finally, a numerical example involving investment is introduced to illustrate the application and validity of the proposed method.
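The rough numbers that the criteria weights are built from can be sketched directly: the lower and upper limits of a judgment within a set of expert judgments are the means of all judgments not above it and not below it, respectively. The judgment values below are hypothetical.

```python
def rough_number(value, judgments):
    """Rough-number lower/upper limits of `value` within a set of expert
    judgments: the mean of all judgments <= value and the mean of all
    judgments >= value."""
    lower = [j for j in judgments if j <= value]
    upper = [j for j in judgments if j >= value]
    return sum(lower) / len(lower), sum(upper) / len(upper)
```

A criterion weight can then be carried through the BWM linear program as the interval [lower, upper] rather than as a crisp number, which is how the approach keeps the group's disagreement in play.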


