Possible entropy functions

2003 ◽  
Vol 135 (1) ◽  
pp. 221-250 ◽  
Author(s):  
Tomasz Downarowicz ◽  
Jacek Serafin
2016 ◽  
Vol 11 (2) ◽  
pp. 205-209
Author(s):  
D.T. Siraeva

An invariant submodel of rank 2 on the subalgebra consisting of the sum of transfers is presented for the hydrodynamic equations with an equation of state in which the pressure is the sum of density and entropy functions. In terms of Lagrangian coordinates, solutions depending on four essential constants are obtained from the non-hyperbolicity condition of the submodel. For simplicity, a solution depending on two constants is considered. The trajectories of particle motion and the motion of a parallelepiped of the same particles are studied using Maple.
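For orientation, the equation of state described above is of the separable form p = f(ρ) + g(S), where p is the pressure, ρ the density and S the entropy, and f and g are functions of their single arguments; the symbols f and g are chosen here for illustration rather than taken from the paper.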


2018 ◽  
Vol 13 (3) ◽  
pp. 59-63 ◽  
Author(s):  
D.T. Siraeva

Equations of hydrodynamic type with an equation of state in which the pressure separates into a sum of density and entropy functions are considered. Such a system of equations admits a twelve-dimensional Lie algebra. In the case of an equation of state of general form, the equations of gas dynamics admit an eleven-dimensional Lie algebra. For both Lie algebras, optimal systems of non-similar subalgebras have been constructed. In this paper, two partially invariant submodels of rank 3 and defect 1 are constructed for two-dimensional subalgebras of the twelve-dimensional Lie algebra. The reduction of the constructed submodels to invariant submodels of the eleven-dimensional and twelve-dimensional Lie algebras is proved.


Electronics ◽  
2021 ◽  
Vol 10 (6) ◽  
pp. 657
Author(s):  
Krzysztof Gajowniczek ◽  
Tomasz Ząbkowski

This paper presents two R packages, ImbTreeEntropy and ImbTreeAUC, for handling imbalanced data problems. ImbTreeEntropy's functionality includes the application of generalized entropy functions, such as the Rényi, Tsallis, Sharma–Mittal, Sharma–Taneja and Kapur entropies, to measure the impurity of a node. ImbTreeAUC provides non-standard measures for choosing the optimal split point of an attribute (as well as the optimal attribute for splitting) by employing local, semi-global and global AUC (area under the ROC curve) measures. Both packages are applicable to binary and multiclass problems, and they support cost-sensitive learning, by defining a misclassification cost matrix, as well as weight-sensitive learning. The packages accept all types of attributes, including continuous, ordered and nominal, where the latter type is simplified for multiclass problems to reduce the computational overhead. Both applications enable optimization of the thresholds at which posterior probabilities determine the final class labels, so that misclassification costs are minimized. Model overfitting can be managed either during the growing phase or at the end using post-pruning. The packages are implemented mainly in R; however, some computationally demanding functions are written in plain C++. To reduce learning time, parallel processing is supported as well.
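As a rough illustration of the generalized entropy impurities mentioned above, the following is a minimal Python sketch of the underlying formulas, not the packages' actual R implementation; the function names and the default order q are assumptions made for this example:

import numpy as np

def shannon_entropy(p):
    # Shannon entropy of a class-probability vector p (natural logarithm).
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi_entropy(p, q=2.0):
    # Renyi entropy of order q (q > 0, q != 1); approaches Shannon entropy as q -> 1.
    p = p[p > 0]
    return np.log(np.sum(p ** q)) / (1.0 - q)

def tsallis_entropy(p, q=2.0):
    # Tsallis entropy of order q (q > 0, q != 1); approaches Shannon entropy as q -> 1.
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def node_impurity(labels, entropy=renyi_entropy, **kwargs):
    # Impurity of a tree node computed from the class proportions of its labels.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return entropy(p, **kwargs)

# A node dominated by one class has low impurity, whichever entropy is used.
print(node_impurity(np.array([0, 0, 0, 1]), renyi_entropy, q=2.0))

Selecting a split would then amount to choosing the attribute and threshold that minimize the size-weighted impurity of the child nodes; the AUC-based criteria of ImbTreeAUC play an analogous role, with a ranking measure taking the place of an entropy.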


1995 ◽  
Vol 138 (1-3) ◽  
pp. 319-326
Author(s):  
A. Meir ◽  
J.W. Moon

1992 ◽  
Vol 64 (1-2) ◽  
pp. 143-148
Author(s):  
M. Behara ◽  
Z. Dudek

2020 ◽  
Vol 34 (04) ◽  
pp. 5742-5749
Author(s):  
Xiaoshuang Shi ◽  
Fuyong Xing ◽  
Yuanpu Xie ◽  
Zizhao Zhang ◽  
Lei Cui ◽  
...  

Although attention mechanisms have been widely used in deep learning for many tasks, they are rarely utilized to solve multiple instance learning (MIL) problems, where only a general category label is given for the multiple instances contained in one bag. Additionally, previous deep MIL methods first utilize the attention mechanism to learn instance weights and then employ a fully connected layer to predict the bag label, so that the bag prediction is largely determined by the effectiveness of the learned instance weights. To alleviate this issue, in this paper we propose a novel loss-based attention mechanism for deep multiple instance learning, which simultaneously learns instance weights, instance predictions and bag predictions. Specifically, it calculates instance weights based on the loss function, e.g. softmax plus cross-entropy, and shares its parameters with the fully connected layer that produces the instance and bag predictions. Additionally, a regularization term consisting of the learned weights and cross-entropy functions is utilized to boost the recall of instances, and a consistency cost is used to smooth the training process of the neural networks and thereby improve the model's generalization performance. Extensive experiments on multiple types of benchmark databases demonstrate that the proposed attention mechanism is a general, effective and efficient framework that achieves superior bag and image classification performance over other state-of-the-art MIL methods, while obtaining higher instance precision and recall than previous attention mechanisms. Source code is available at https://github.com/xsshi2015/Loss-Attention.
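To make the mechanism more tangible, here is a minimal numpy sketch of the general idea: per-instance weights derived from a softmax-plus-cross-entropy loss computed with a fully connected layer whose parameters are shared with the bag predictor. The function names, shapes and the softmax(-loss) weighting are assumptions made for this illustration, not the authors' implementation from the repository above.

import numpy as np

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def loss_attention_forward(H, W, b, bag_label):
    # H: (n_instances, d) instance features; W: (d, n_classes) and b: (n_classes,)
    # are the parameters of the shared fully connected layer; bag_label is the
    # integer class label assigned to the whole bag.

    # Instance-level predictions from the shared fully connected layer.
    logits = H @ W + b                                        # (n, c)
    instance_probs = softmax(logits)                          # (n, c)

    # Per-instance cross-entropy loss against the bag label.
    losses = -np.log(instance_probs[:, bag_label] + 1e-12)    # (n,)

    # Turn losses into attention weights: instances with low loss (those that
    # support the bag label) receive larger weights; softmax(-loss) is an
    # illustrative choice of transformation.
    weights = softmax(-losses)                                # (n,)

    # Bag prediction: aggregate the instance features with the weights and
    # reuse the same fully connected layer (shared parameters).
    bag_probs = softmax(weights @ H @ W + b)                  # (c,)
    return weights, instance_probs, bag_probs

# Toy usage: a bag of 5 instances with 8-dimensional features and 3 classes.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))
W = rng.normal(size=(8, 3))
b = np.zeros(3)
weights, instance_probs, bag_probs = loss_attention_forward(H, W, b, bag_label=1)

Training would backpropagate through both the instance and bag predictions, with the regularization and consistency terms described in the abstract added on top of the basic loss.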

