MAXIMUM ENTROPY PRINCIPLE IN THE DIFFERENTIAL SECOND-ORDER REALIZATION OF A NON-STATIONARY BILINEAR SYSTEM

2019 · Vol 20 (2) · pp. 223-248
Author(s): V. A. Rusanov, A. V. Banshchikov, A. V. Daneev, A. V. Lakeyev
2015 · Vol 17 (2) · pp. 371-400
Author(s): Roman Pascal Schaerer, Manuel Torrilhon

Abstract: Moment equations provide a flexible framework for approximating the Boltzmann equation in kinetic gas theory. While moments up to second order suffice for the description of equilibrium processes, the inclusion of higher-order moments, such as the heat flux vector, extends the validity of the Euler equations to non-equilibrium gas flows in a natural way.

Unfortunately, the classical closure theory proposed by Grad leads to moment equations that not only suffer from a restricted hyperbolicity region but are also affected by non-physical sub-shocks in the continuous shock-structure problem if the shock velocity exceeds a critical value. A more recently suggested closure theory based on the maximum entropy principle yields symmetric hyperbolic moment equations. However, if moments higher than second order are included, the computational demand of this closure can be overwhelming. Additionally, it was shown for the 5-moment system that the closing flux becomes singular on a subset of moments that includes the equilibrium state.

Motivated by recent promising results on closed-form singular closures based on the maximum entropy approach, we study regularized singular closures that become singular on a subset of moments when the regularizing terms are removed. In order to study some implications of singular closures, we use a recently proposed explicit closure for the 5-moment equations. We show that this closure theory results in a hyperbolic system that can mitigate the problem of sub-shocks independently of the shock-wave velocity and can handle strongly non-equilibrium gas flows.
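The core construction behind the maximum entropy closure can be sketched numerically. As a minimal illustration (not the paper's actual 5-moment solver; the grid, moment functions, and target values below are invented for demonstration), among all densities on a discrete grid matching prescribed moments, the maximum entropy candidate has exponential form, and its Lagrange multipliers can be found by minimizing the convex dual function:

```python
import numpy as np

# Maximum entropy density on a discrete grid subject to moment constraints.
# The maximizer has the form p_i ∝ exp(sum_k lambda_k * phi_k(x_i)); the
# multipliers minimize the convex dual  psi(l) = log(sum_i exp(l·phi_i)) - l·m.

x = np.linspace(-1.0, 1.0, 201)      # discrete grid (assumed, for illustration)
phi = np.vstack([x, x**2])           # moment functions: mean and second moment
m = np.array([0.1, 0.4])             # target moments (assumed feasible)

lam = np.zeros(2)
for _ in range(20000):               # plain gradient descent on the dual
    w = np.exp(phi.T @ lam)
    p = w / w.sum()
    grad = phi @ p - m               # dual gradient = moment mismatch
    lam -= 0.5 * grad

print(phi @ p)                       # ≈ [0.1, 0.4] after convergence
```

The dual is smooth and convex, so fixed-step gradient descent suffices here; the singularity issues discussed in the abstract arise when the target moments approach the boundary of the realizable set, where the multipliers diverge.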


Entropy · 2021 · Vol 23 (10) · pp. 1307
Author(s): Mauricio A. Valle, Jaime F. Lavín, Nicolás S. Magner

The financial market is a complex system in which assets influence each other, causing, among other effects, price interactions and co-movement of returns. Using the Maximum Entropy Principle approach, we analyze the interactions between a selected set of stock assets and equity indices under episodes of high and low return volatility during the 2008 Subprime Crisis and the 2020 Covid-19 outbreak. We carry out an inference process to identify the interactions, in which we implement a pairwise Ising distribution model describing the first and second moments of the distribution of the discretized returns of each asset. Our results indicate that second-order interactions explain more than 80% of the entropy in the system during the Subprime Crisis and slightly more than 50% during the Covid-19 outbreak, independently of whether the period analyzed had high or low volatility. The evidence shows that during these periods, slight changes in the second-order interactions are enough to induce large changes in asset correlations, but the proportion of positive and negative interactions remains virtually unchanged. Although some interactions change sign, the proportion of such changes is the same from period to period, which keeps the system in a ferromagnetic state. These results hold even when analyzing triadic structures in the signed network of couplings.
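The pairwise Ising model referenced above can be made concrete on a toy scale. The sketch below (with invented fields and couplings, and only three "assets" so the state space can be enumerated exactly) shows how the model assigns probabilities to sign-discretized returns and how first and second moments follow from it:

```python
import numpy as np
from itertools import product

# Pairwise Ising model for discretized returns s_i ∈ {-1, +1}:
#   P(s) ∝ exp( sum_i h_i s_i + sum_{i<j} J_ij s_i s_j )
# Fields h match first moments; couplings J match pairwise second moments.
# Parameters here are hypothetical, chosen only to illustrate the mechanics.

h = np.array([0.2, -0.1, 0.05])                  # assumed fields
J = np.array([[0.0,  0.3, -0.1],
              [0.3,  0.0,  0.2],
              [-0.1, 0.2,  0.0]])                # assumed symmetric couplings

states = np.array(list(product([-1, 1], repeat=3)))   # all 2^3 configurations
E = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
p = np.exp(E)
p /= p.sum()                                     # normalize to a distribution

mean = p @ states                                      # first moments <s_i>
corr = np.einsum('k,ki,kj->ij', p, states, states)     # second moments <s_i s_j>
print(mean)
print(corr)
```

For realistic asset counts the normalization sum is intractable to enumerate, which is why inference in practice relies on approximate methods; the exact enumeration above is only feasible for small systems.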


1990 · Vol 27 (2) · pp. 303-313
Author(s): Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large deviation techniques) which provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears, furthermore, that the maximum entropy principle yields a natural binding between descriptive methods and certain statistical structures.
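The maximizing distribution in this setting has a well-known closed form. As a brief sketch (standard Lagrange-multiplier material, not taken from the paper itself): maximizing the entropy of a discrete distribution subject to linear constraints gives an exponential family,

```latex
\max_{p}\; H(p) = -\sum_i p_i \log p_i
\quad\text{s.t.}\quad \sum_i p_i f_k(x_i) = m_k,\;\; \sum_i p_i = 1,
\qquad\Longrightarrow\qquad
p_i = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_k \lambda_k f_k(x_i)\Big),
\quad Z(\lambda) = \sum_i \exp\!\Big(-\sum_k \lambda_k f_k(x_i)\Big),
```

with the multipliers $\lambda_k$ chosen so that the moment constraints hold. The concentration theorem mentioned above says, roughly, that among all empirical distributions satisfying the constraints, the overwhelming majority lie close to this maximizer.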


Author(s): Kai Yao, Jinwu Gao, Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. After that, the maximum entropy principle is introduced, and the arc-cosine distributed variables are proved to have the maximum sine entropy for a given expected value and variance.
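For orientation, the two entropy notions contrasted above can be written side by side. This is my reading based on the standard definitions in uncertainty theory, since the abstract does not restate them; take the formulas as an assumption rather than a quotation from the paper. For an uncertain variable $\xi$ with uncertainty distribution $\Phi$, the logarithmic entropy integrates the binary-entropy function of $\Phi$, while sine entropy replaces that integrand by a sine:

```latex
H[\xi] = \int_{-\infty}^{+\infty}
  \Big( -\Phi(x)\ln\Phi(x) - \big(1-\Phi(x)\big)\ln\big(1-\Phi(x)\big) \Big)\,dx,
\qquad
S[\xi] = \int_{-\infty}^{+\infty} \sin\!\big(\pi\,\Phi(x)\big)\,dx.
```

Both integrands vanish where $\Phi(x)\in\{0,1\}$ and peak at $\Phi(x)=\tfrac12$, so both reward distributions that stay "undecided" over a wide range, but the sine form avoids the logarithm's behavior near the endpoints.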

