A Maximum Entropy Model of Bounded Rational Decision-Making with Prior Beliefs and Market Feedback

Entropy ◽  
2021 ◽  
Vol 23 (6) ◽  
pp. 669
Author(s):  
Benjamin Patrick Evans ◽  
Mikhail Prokopenko

Bounded rationality is an important consideration stemming from the fact that agents often have limits on their processing abilities, making the assumption of perfect rationality inapplicable to many real tasks. We propose an information-theoretic approach to the inference of agent decisions under Smithian competition. The model explicitly captures the boundedness of agents (limited in their information-processing capacity) as the cost of information acquisition for expanding their prior beliefs. The expansion is measured as the Kullback–Leibler divergence between posterior decisions and prior beliefs. When information acquisition is free, the homo economicus agent is recovered, while when information acquisition becomes costly, agents instead revert to their prior beliefs. The maximum entropy principle is used to infer least-biased decisions based upon the notion of Smithian competition formalised within the Quantal Response Statistical Equilibrium framework. Incorporating prior beliefs into this framework allows us to systematically explore the effects of prior beliefs on decision-making in the presence of market feedback and, importantly, adds a temporal interpretation to the framework. We verify the proposed model using Australian housing market data, showing how the incorporation of prior knowledge alters the resulting agent decisions. Specifically, it allows the agent's past beliefs to be separated from its utility-maximising behaviour, and enables analysis of the evolution of agent beliefs.
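The trade-off described in this abstract has a well-known closed-form solution: maximising expected utility minus a KL-divergence cost from the prior yields a posterior that tilts the prior by a softmax over utilities. A minimal sketch of that solution (the inverse-temperature parameter `beta` and the toy utilities and prior below are illustrative, not values from the paper):

```python
import numpy as np

def bounded_rational_choice(utilities, prior, beta):
    """Posterior over actions maximising E[U] - (1/beta) * KL(posterior || prior).

    beta -> infinity recovers pure utility maximisation (homo economicus);
    beta -> 0 collapses the posterior back onto the prior (costly information).
    """
    logits = np.log(prior) + beta * np.asarray(utilities, dtype=float)
    logits -= logits.max()            # shift for numerical stability
    p = np.exp(logits)
    return p / p.sum()

U = np.array([1.0, 2.0, 3.0])         # hypothetical action utilities
q = np.array([0.5, 0.3, 0.2])         # hypothetical prior beliefs

p_free = bounded_rational_choice(U, q, beta=0.0)    # reverts to the prior
p_cheap = bounded_rational_choice(U, q, beta=50.0)  # concentrates on argmax U
```

Intermediate values of `beta` interpolate smoothly between the two regimes, which is what allows the cost of information acquisition to be read off from fitted decisions.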

2014 ◽  
Vol 1079-1080 ◽  
pp. 942-945
Author(s):  
Yi Bing Jiang ◽  
Xu Nan Gao ◽  
Ping Han

Mastering the law of vehicle speed is very important for traffic flow theory, for traffic management and control, and for the planning and design of roads, and the speed distribution is the most intuitive way to describe this law. Based on the maximum entropy principle, this paper proposes a model that can generate speed distribution curves using only observed data. The validity of the method is demonstrated through an example analysis. We also find that the maximum entropy distribution fits better than the normal distribution in peak hours and slightly worse in off-peak hours, though the difference is not great. The model requires no distributional assumptions, is simple to calculate, and is highly practical, and so contributes to the study of the speed distribution.
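The construction described above can be sketched numerically: maximising entropy subject to matching observed moments yields an exponential-family density whose Lagrange multipliers solve a convex dual problem. A minimal illustration (the speed grid, the moment values, and the helper name are hypothetical, not the paper's data or method details):

```python
import numpy as np
from scipy.optimize import minimize

def maxent_speed_distribution(grid, mean, second_moment):
    """Discretised maximum-entropy density on `grid` constrained to match an
    observed mean and second moment: p(v) proportional to exp(l1*v + l2*v^2).
    With these two constraints the solution is a (truncated) Gaussian shape."""
    grid = np.asarray(grid, dtype=float)
    scale = grid.max()
    x = grid / scale                              # rescale for conditioning
    feats = np.vstack([x, x**2])                  # sufficient statistics
    m = np.array([mean / scale, second_moment / scale**2])

    def dual(lam):
        # Convex dual: log-partition function minus lam . observed moments
        logw = lam @ feats
        logZ = np.log(np.exp(logw - logw.max()).sum()) + logw.max()
        return logZ - lam @ m

    lam = minimize(dual, x0=np.zeros(2), method="BFGS").x
    w = np.exp(lam @ feats - (lam @ feats).max())
    return w / w.sum()

speeds = np.linspace(0.0, 120.0, 241)             # hypothetical km/h grid
p = maxent_speed_distribution(speeds, mean=60.0,
                              second_moment=60.0**2 + 12.0**2)
```

Adding higher-order moment constraints extends `feats` with more powers of `x`, which is how such a model can depart from normality when the data do.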


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Arian Ashourvan ◽  
Preya Shah ◽  
Adam Pines ◽  
Shi Gu ◽  
Christopher W. Lynn ◽  
...  

A major challenge in neuroscience is determining a quantitative relationship between the brain’s white matter structural connectivity and emergent activity. We seek to uncover the intrinsic relationship among brain regions fundamental to their functional activity by constructing a pairwise maximum entropy model (MEM) of the inter-ictal activation patterns of five patients with medically refractory epilepsy, using an average of ~14 hours of band-passed intracranial EEG (iEEG) recordings per patient. We find that the pairwise MEM accurately predicts both the probability of iEEG electrodes’ activation patterns and their pairwise correlations. We demonstrate that, in most patients, the estimated pairwise MEM’s interaction weights predict structural connectivity and its strength over several frequencies significantly beyond what is expected based solely on the distance between sampled regions. Together, these results show that the pairwise MEM offers a framework for explaining iEEG functional connectivity and provides insight into how the brain’s structural connectome gives rise to large-scale activation patterns by promoting co-activation between connected structures.
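A pairwise MEM of the kind described assigns each binary activation pattern a probability determined solely by per-electrode biases and pairwise couplings. A toy sketch using exact enumeration (the three-electrode system and its parameters are hypothetical; an actual fit would estimate `h` and `J` from recorded activation statistics):

```python
import itertools
import numpy as np

def pairwise_mem(h, J):
    """Pairwise maximum-entropy model over binary activation patterns:
    P(x) proportional to exp(h.x + sum_{i<j} J_ij x_i x_j), x_i in {0, 1}.
    Enumerates all 2^n patterns exactly, so it is feasible only for small n."""
    n = len(h)
    patterns = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
    # 0.5 factor: the symmetric J matrix counts each (i, j) pair twice
    energies = patterns @ h + 0.5 * np.einsum("ki,ij,kj->k", patterns, J, patterns)
    p = np.exp(energies - energies.max())
    return patterns, p / p.sum()

# Three hypothetical electrodes; a positive coupling J[0,1] promotes co-activation
h = np.array([-1.0, -1.0, -1.0])
J = np.zeros((3, 3))
J[0, 1] = J[1, 0] = 2.0
patterns, p = pairwise_mem(h, J)
corr01 = (patterns[:, 0] * patterns[:, 1] * p).sum()   # E[x0 * x1]
```

The positive coupling raises the joint activation probability above the product of the marginals, which is the mechanism by which structural connections could leave a signature in the fitted interaction weights.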


1990 ◽  
Vol 27 (2) ◽  
pp. 303-313 ◽  
Author(s):  
Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to some appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears, furthermore, that the maximum entropy principle yields a natural bridge between descriptive methods and some statistical structures.
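For a concrete instance of the principle this abstract justifies, consider Jaynes' classic die: the maximum-entropy distribution on {1, …, 6} under a single linear constraint on the mean is a one-parameter exponential family, and the Lagrange multiplier can be found by root-finding. A sketch (illustrative, not drawn from the paper):

```python
import numpy as np
from scipy.optimize import brentq

def maxent_with_mean(values, target_mean):
    """Maximum-entropy distribution on `values` under the single linear
    constraint E[X] = target_mean; the solution is p_i proportional to
    exp(lam * x_i), with lam chosen so the constraint holds."""
    x = np.asarray(values, dtype=float)

    def weights(lam):
        e = lam * x
        return np.exp(e - e.max())                # stable for lam of either sign

    def mean_gap(lam):
        w = weights(lam)
        return (w * x).sum() / w.sum() - target_mean

    lam = brentq(mean_gap, -50.0, 50.0)           # bracket the multiplier
    w = weights(lam)
    return w / w.sum()

# A die whose observed average is 4.5 rather than the fair 3.5
p = maxent_with_mean(range(1, 7), 4.5)
```

The concentration theorem says that, among all distributions satisfying the constraint, empirical frequencies overwhelmingly cluster near this `p`, which is what licenses it as the least-biased model.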

