Schur Concave
Recently Published Documents


TOTAL DOCUMENTS: 23 (FIVE YEARS: 0)

H-INDEX: 7 (FIVE YEARS: 0)

10.37236/7250 · 2017 · Vol 24 (4)
Author(s): Evan Chen

For integers $a_1, \dots, a_n \ge 0$ and $k \ge 1$, let $\mathcal L_{k+2}(a_1,\dots, a_n)$ denote the set of permutations of $\{1, \dots, a_1+\dots+a_n\}$ whose descent set is contained in $\{a_1, a_1+a_2, \dots, a_1+\dots+a_{n-1}\}$ and which avoid the pattern $12\dots(k+2)$. We exhibit some bijections between such sets, most notably showing that $\# \mathcal L_{k+2} (a_1, \dots, a_n)$ is symmetric in the $a_i$ and is in fact Schur-concave. This generalizes a set of equivalences observed by Mei and Wang.
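For small parameters, the symmetry and Schur-concavity claimed in this abstract can be verified by brute force. A minimal sketch (not from the paper; the helper names `count_L` and `lis_length` are illustrative):

```python
from itertools import permutations
import bisect

def lis_length(seq):
    """Length of the longest increasing subsequence (patience sorting)."""
    tails = []
    for x in seq:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)
        else:
            tails[i] = x
    return len(tails)

def count_L(k, a):
    """#L_{k+2}(a_1, ..., a_n): permutations of {1, ..., a_1+...+a_n} whose
    descent set lies in {a_1, a_1+a_2, ...} and which avoid 12...(k+2),
    i.e. have no increasing subsequence of length k+2."""
    n = sum(a)
    cuts = {sum(a[:i]) for i in range(1, len(a))}
    total = 0
    for p in permutations(range(1, n + 1)):
        # every descent position must be one of the allowed cut points
        if any(p[i] > p[i + 1] and (i + 1) not in cuts for i in range(n - 1)):
            continue
        if lis_length(p) <= k + 1:
            total += 1
    return total

# Symmetry in the a_i, and Schur-concavity: (3,1) majorizes (2,2),
# so the count at (3,1) should be at most the count at (2,2).
assert count_L(1, (2, 1, 1)) == count_L(1, (1, 2, 1)) == count_L(1, (1, 1, 2))
assert count_L(1, (3, 1)) <= count_L(1, (2, 2))
```

The checks are relational (symmetry under permuting the $a_i$, monotonicity along a majorization step) rather than hard-coded counts, so they exercise exactly the two properties the abstract asserts.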


2017 · Vol 17 (1)
Author(s): Hautahi Kingi

Abstract: I analyze the welfare effects of a policy of modern sector enlargement (MSENL) and a policy of increasing the efficiency of on-the-job search from the urban informal sector (IEOS) in a generalized Harris-Todaro model. I show that MSENL causes a Lorenz worsening of the income distribution and that IEOS causes a Lorenz improvement. In a rare direct application of the Atkinson theorem, I conclude that MSENL decreases social welfare and IEOS increases social welfare for all anonymous, increasing, and Schur-concave social welfare functions.
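The Lorenz/Atkinson logic in this abstract is easy to illustrate numerically. A minimal sketch, assuming hypothetical income vectors and an Atkinson-type welfare function (an illustration, not the paper's model):

```python
import numpy as np

def lorenz(incomes):
    """Lorenz curve ordinates: cumulative income share of the poorest k people."""
    x = np.sort(np.asarray(incomes, dtype=float))
    return np.cumsum(x) / x.sum()

def atkinson_welfare(incomes, eps=2.0):
    """An Atkinson-type social welfare function (eps != 1): symmetric
    (anonymous), increasing, and Schur-concave for eps > 0."""
    x = np.asarray(incomes, dtype=float)
    return np.mean(x ** (1 - eps)) ** (1 / (1 - eps))

before = [1, 2, 3, 10]  # hypothetical pre-policy incomes
after_ = [2, 2, 4, 8]   # same total income, Lorenz improvement

# The post-policy Lorenz curve lies weakly above the pre-policy curve, so by
# the Atkinson theorem every anonymous, increasing, Schur-concave welfare
# function must rank `after_` at least as high as `before`.
assert np.all(lorenz(after_) >= lorenz(before))
assert atkinson_welfare(after_) > atkinson_welfare(before)
```

This is the mechanism the paper invokes: once Lorenz dominance is established, the welfare ranking follows for the whole class of functions at once, without committing to any particular welfare function.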


2012 · Vol 21 (2) · pp. 227-234
Author(s): YU-DONG WU, ZHI-HUA ZHANG, ZHI-GANG WANG, ...

In this paper, by using the theory of Schur-concave functions and classical analysis, we give a class of analytic inequalities which affirmatively settle several open problems posed by Sun.
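A standard tool for establishing Schur-concavity in such inequality work is the Schur-Ostrowski criterion, and a numerical probe of it is easy to sketch. The function names and the product example below are illustrative, not taken from the paper:

```python
import numpy as np

def schur_ostrowski_violations(f, samples, h=1e-5, tol=1e-6):
    """Probe the Schur-Ostrowski criterion numerically: a symmetric,
    differentiable f is Schur-concave iff
        (x_1 - x_2) * (df/dx_1 - df/dx_2) <= 0
    at every point of the domain. Returns how many sample points violate
    this, using central finite differences for the partial derivatives."""
    bad = 0
    for x in samples:
        x = np.asarray(x, dtype=float)
        e1 = np.zeros_like(x); e1[0] = h
        e2 = np.zeros_like(x); e2[1] = h
        d1 = (f(x + e1) - f(x - e1)) / (2 * h)
        d2 = (f(x + e2) - f(x - e2)) / (2 * h)
        if (x[0] - x[1]) * (d1 - d2) > tol:
            bad += 1
    return bad

# The product x_1 * x_2 * ... * x_n is a classical Schur-concave function
# on positive vectors, so no sample point should violate the criterion.
rng = np.random.default_rng(0)
samples = rng.uniform(0.1, 2.0, size=(200, 3))
assert schur_ostrowski_violations(np.prod, samples) == 0
```

Such a random probe cannot prove Schur-concavity, but it is a quick sanity check before attempting the analytic verification the paper carries out.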


2003 · Vol 15 (2) · pp. 349-396
Author(s): Kenneth Kreutz-Delgado, Joseph F. Murray, Bhaskar D. Rao, Kjersti Engan, Te-Won Lee, ...

Algorithms for data-driven learning of domain-specific overcomplete dictionaries are developed to obtain maximum likelihood and maximum a posteriori dictionary estimates based on the use of Bayesian models with concave/Schur-concave (CSC) negative log priors. Such priors are appropriate for obtaining sparse representations of environmental signals within an appropriately chosen (environmentally matched) dictionary. The elements of the dictionary can be interpreted as concepts, features, or words capable of succinct expression of events encountered in the environment (the source of the measured signals). This is a generalization of vector quantization in that one is interested in a description involving a few dictionary entries (the proverbial “25 words or less”), but not necessarily as succinct as one entry. To learn an environmentally adapted dictionary capable of concise expression of signals generated by the environment, we develop algorithms that iterate between a representative set of sparse representations found by variants of FOCUSS and an update of the dictionary using these sparse representations. Experiments were performed using synthetic data and natural images. For complete dictionaries, we demonstrate that our algorithms have improved performance over other independent component analysis (ICA) methods, measured in terms of signal-to-noise ratios of separated sources. In the overcomplete case, we show that the true underlying dictionary and sparse sources can be accurately recovered. In tests with natural images, learned overcomplete dictionaries are shown to have higher coding efficiency than complete dictionaries; that is, images encoded with an overcomplete dictionary have both higher compression (fewer bits per pixel) and higher accuracy (lower mean square error).
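The FOCUSS step inside the dictionary-learning loop can be sketched as an iteratively reweighted minimum-norm solve. Below is a minimal, assumed version of that inner step only; the dictionary update and the paper's regularized variants are omitted:

```python
import numpy as np

def focuss(A, y, p=1.0, iters=50, eps=1e-9):
    """Basic FOCUSS iteration: find a sparse x with A @ x = y by repeatedly
    solving a reweighted minimum-norm problem; entries driven below eps are
    pinned to zero. A sketch of the core idea, not the paper's exact variant."""
    x = np.linalg.pinv(A) @ y              # minimum-norm starting point
    for _ in range(iters):
        w = np.abs(x) ** (1.0 - p / 2.0)   # reweighting concentrates energy
        Aw = A * w                         # equals A @ diag(w) by broadcasting
        x = w * (np.linalg.pinv(Aw) @ y)
        x[np.abs(x) < eps] = 0.0
    return x

# Hypothetical overcomplete setup: 8 measurements, a 20-atom dictionary,
# and a 2-sparse source vector to recover.
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 20))
x_true = np.zeros(20)
x_true[[3, 11]] = [1.5, -2.0]
y = A @ x_true
x_hat = focuss(A, y)
```

Here `focuss` plays the role of the sparse-representation step; the algorithms in the paper alternate such solves over a representative batch of signals with an update of the dictionary `A` itself.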

