Conditional independence between two variables given any conditioning subset implies block diagonal covariance matrix for multivariate Gaussian distributions

2008 ◽  
Vol 78 (13) ◽  
pp. 1922-1928
Author(s):  
Guillaume Marrelec ◽  
Habib Benali
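
The titular result can be illustrated numerically: for a multivariate Gaussian with a block-diagonal covariance matrix, variables in different blocks remain conditionally independent given any subset of the other variables, since the conditional covariance (a Schur complement) stays zero off the blocks. A minimal sketch, using a hypothetical 4-dimensional example (the matrix values are illustrative, not from the paper):

```python
import numpy as np

# Block-diagonal covariance: variables {0,1} form one block, {2,3} the other.
Sigma = np.array([
    [2.0, 1.0, 0.0, 0.0],
    [1.0, 2.0, 0.0, 0.0],
    [0.0, 0.0, 3.0, 0.5],
    [0.0, 0.0, 0.5, 1.0],
])

def conditional_cov(Sigma, A, B):
    """Covariance of X_A given X_B for a Gaussian: the Schur complement
    Sigma_AA - Sigma_AB Sigma_BB^{-1} Sigma_BA."""
    SAA = Sigma[np.ix_(A, A)]
    SAB = Sigma[np.ix_(A, B)]
    SBB = Sigma[np.ix_(B, B)]
    return SAA - SAB @ np.linalg.solve(SBB, SAB.T)

# X0 and X2 lie in different blocks: conditioning on any subset of the
# remaining variables leaves their conditional covariance at zero, which
# for Gaussians is equivalent to conditional independence.
for B in ([1], [3], [1, 3]):
    C = conditional_cov(Sigma, [0, 2], B)
    assert abs(C[0, 1]) < 1e-12
```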
2016 ◽  
Author(s):  
Osama Ashfaq

Li (ICCV, 2005) proposed a novel generative/discriminative approach that combines features of different types and uses them to learn image labels. However, the Gaussian mixture model used in Li's paper suffers greatly from the curse of dimensionality. Here I propose an alternative approach to generating local region descriptors: I treat a GMM with diagonal covariance matrices and PCA as separate features and concatenate them into the local descriptor. This reduces the computational cost of the mixture model substantially while achieving accuracies above 90% on the Caltech-4 image sets.
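
The descriptor construction can be sketched as follows. This is a minimal illustration with random stand-in data, not the author's implementation: a diagonal-covariance GMM keeps the parameter count linear in the dimension, and its soft assignments are concatenated with PCA projections to form one descriptor per local region. The component counts and dimensions below are assumptions for the sake of the example.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
patches = rng.normal(size=(500, 64))   # stand-in for local region patches

# Diagonal covariances keep the number of GMM parameters linear in the
# dimension, avoiding the full-covariance curse of dimensionality.
gmm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
gmm.fit(patches)
posteriors = gmm.predict_proba(patches)      # (500, 8) soft component assignments

pca = PCA(n_components=16)
projections = pca.fit_transform(patches)     # (500, 16) linear features

# Concatenate the two views into one local descriptor per patch.
descriptor = np.hstack([posteriors, projections])
assert descriptor.shape == (500, 24)
```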


2021 ◽  
Author(s):  
Thomas Muschinski ◽  
Georg J. Mayr ◽  
Thorsten Simon ◽  
Achim Zeileis

To obtain reliable joint probability forecasts, multivariate postprocessing of numerical weather predictions (NWPs) must account for dependencies among the univariate forecast errors across different forecast horizons, locations, or atmospheric quantities. We develop a framework for multivariate Gaussian regression (MGR), a flexible multivariate postprocessing technique with advantages over state-of-the-art methods.

In MGR, both the mean forecasts and the parameters describing their error covariance matrix may be modeled simultaneously on NWP-derived predictor variables. The bivariate case is straightforward and has been used to postprocess horizontal wind vector forecasts, but higher dimensions present two major difficulties: ensuring that the estimated error covariance matrix is positive definite and regularizing the high model complexity.

We tackle these problems by parameterizing the covariance through the entries of its basic and modified Cholesky decompositions. This guarantees positive definiteness and is the crucial fact that makes it possible to link parameters with predictors in a regression. When there is a natural order to the variables, we can also sensibly reduce complexity through a priori restrictions of the parameter space.

MGR forecasts take the form of full joint parametric distributions, in contrast to ensemble copula coupling (ECC), which obtains samples from the joint distribution. This has the advantage that joint probabilities or quantiles can be derived easily.

Our novel method is applied to postprocess NWPs of surface temperature at an Alpine valley station for ten distinct lead times extending more than one week into the future. All the mean forecasts and their full error covariance matrix are modelled on NWP-derived variables in one step. MGR outperforms ECC combined with nonhomogeneous Gaussian regression.
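
The key trick, parameterizing the covariance through its Cholesky factor so that positive definiteness holds for any unconstrained parameter values, can be sketched as follows. This is an illustrative reconstruction of the basic Cholesky parameterization only (the function name, dimensions, and parameter layout are assumptions, not the authors' code):

```python
import numpy as np

def cov_from_unconstrained(theta, d):
    """Map an unconstrained vector theta to a positive-definite covariance
    via its Cholesky factor: exponentiated diagonal, free lower triangle."""
    L = np.zeros((d, d))
    L[np.diag_indices(d)] = np.exp(theta[:d])        # strictly positive diagonal
    L[np.tril_indices(d, k=-1)] = theta[d:]          # free below-diagonal entries
    return L @ L.T                                   # Sigma = L L^T is PD

d = 4
n_params = d + d * (d - 1) // 2                      # 10 parameters for d = 4
theta = np.random.default_rng(1).normal(size=n_params)
Sigma = cov_from_unconstrained(theta, d)

# Positive definiteness holds for *any* theta, which is what allows each
# entry of theta to be linked to NWP-derived predictors in a regression.
assert np.all(np.linalg.eigvalsh(Sigma) > 0)
assert np.allclose(Sigma, Sigma.T)
```

Because every parameter is unconstrained, each can be modeled as a linear (or additive) function of predictor variables without any projection step, which is what the abstract refers to as linking covariance parameters with predictors.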

