Nonparametric Bayesian Learning of Infinite Multivariate Generalized Normal Mixture Models and Its Applications
2021, Vol 11 (13), pp. 5798
Author(s): Sami Bourouis, Roobaea Alroobaea, Saeed Rubaiee, Murad Andejany, Nizar Bouguila

This paper addresses the modeling, classification and recognition of data vectors using infinite mixture models, which have been shown to be an effective alternative to finite mixtures for selecting the optimal number of clusters. In this work, we propose a novel approach for modeling localized features with an infinite mixture model based on multivariate generalized Normal distributions (inMGNM). The statistical mixture is learned via a nonparametric MCMC-based Bayesian approach, which avoids the crucial problem of model over-fitting and allows uncertainty in the number of mixture components. Robust descriptors are derived by encoding features with the Fisher vector method, which captures higher-order statistics. These descriptors are combined with a linear support vector machine classifier to achieve higher accuracy. The efficiency and merits of the proposed nonparametric Bayesian learning approach, compared with several other methods, are demonstrated on two challenging applications: texture classification and human activity categorization.

2012, Vol 56 (8), pp. 2454-2470
Author(s): Byungtae Seo, Daeyoung Kim

Biometrika, 1992, Vol 79 (4), pp. 842-846
Author(s): Bruno Goffinet, Patrice Loisel, Beatrice Laurent

Entropy, 2016, Vol 18 (11), pp. 382
Author(s): Javier Contreras-Reyes, Daniel Cortés

Mixture models are in high demand for machine-learning analysis due to their computational tractability, and because they serve as a good approximation for continuous densities. Predominantly, entropy applications have been developed in the context of a mixture of normal densities. In this paper, we consider a novel class of skew-normal mixture models, whose components capture skewness due to their flexibility. We find upper and lower bounds for the Shannon and Rényi entropies of this model. Using such a pair of bounds, a confidence interval for the approximate entropy value can be calculated. In addition, an asymptotic expression for the Rényi entropy, obtained via Stirling's approximation, is given, and upper and lower bounds are reported using multinomial coefficients and some properties and inequalities of L_p metric spaces. Simulation studies are then applied to a swordfish (Xiphias gladius Linnaeus) length dataset.
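The idea of sandwiching a mixture's entropy between computable bounds can be illustrated numerically. The sketch below does not reproduce the paper's sharper Shannon/Rényi bounds; it uses the classical conditioning inequalities Σ w_k H_k ≤ H(f) ≤ Σ w_k H_k + H(w) for a two-component skew-normal mixture with made-up parameters, and checks them against a Monte Carlo estimate of the mixture's Shannon entropy.

```python
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(1)

# two-component skew-normal mixture: weights, shape (skewness), location, scale
w = np.array([0.6, 0.4])
a = np.array([4.0, -3.0])
loc = np.array([0.0, 3.0])
scale = np.array([1.0, 1.5])

def mixture_pdf(x):
    return sum(w[k] * skewnorm.pdf(x, a[k], loc[k], scale[k]) for k in range(2))

# Monte Carlo estimate of the Shannon entropy H(f) = -E[log f(X)]
n = 200_000
comp = rng.choice(2, size=n, p=w)                 # sample mixture labels
samples = skewnorm.rvs(a[comp], loc[comp], scale[comp], random_state=rng)
H_mix = -np.mean(np.log(mixture_pdf(samples)))

# classical bounds from conditioning on the component label Z:
#   H(X | Z) <= H(X) <= H(X | Z) + H(Z)
H_comp = np.array([skewnorm.entropy(a[k], loc[k], scale[k]) for k in range(2)])
H_lower = w @ H_comp                              # sum_k w_k H_k
H_upper = H_lower - w @ np.log(w)                 # ... + entropy of the weights

print(f"lower {H_lower:.3f} <= H(f) ~ {H_mix:.3f} <= upper {H_upper:.3f}")
```

The gap between the two bounds is exactly the mixing entropy H(w), which is what motivates the search for tighter, distribution-specific bounds such as those derived in the paper.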

