Semantic Auto-Encoder with L2-norm Constraint for Zero-Shot Learning

Author(s): Yuhao Wu, Weipeng Cao, Ye Liu, Zhong Ming, Jianqiang Li, ...
Keyword(s): L2 Norm
2019, Vol 38 (7), pp. 3211-3226

Author(s): Zuyuan Yang, Yifei Hu, Naiyao Liang, Jun Lv

2019
Author(s): X. Ma, G. Li, Y. Wang, H. Li, W. Yang
Keyword(s): L2 Norm

Author(s): S. G. Rajeev

The initial value problem of the incompressible Navier–Stokes equations is explained. Leray's classic study of it (using Picard iteration) is simplified and described in the language of physics. The ideas of Lebesgue and Sobolev norms are explained. The L2 norm, being the energy, cannot increase; this gives sufficient control to establish existence, regularity, and uniqueness for two-dimensional flow. The L3 norm is not guaranteed to decrease, so this strategy fails in three dimensions. Leray's proof of regularity for a finite time is outlined. His attempts to construct a scale-invariant singular solution, and modern work showing this is impossible, are then explained. The physical consequences of a negative answer to the question of regularity of Navier–Stokes solutions are discussed. This chapter is meant as an introduction, for physicists, to a difficult field of analysis.
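For readers who want the energy statement made explicit, the estimate referred to above is the standard identity below (a textbook fact for smooth, sufficiently decaying solutions, not quoted from the chapter; u denotes the velocity field and ν the kinematic viscosity):

    \frac{1}{2}\,\frac{d}{dt}\,\|u(\cdot,t)\|_{L^2}^2
      \;=\; -\,\nu\,\|\nabla u(\cdot,t)\|_{L^2}^2 \;\le\; 0,
    \qquad
    \|u\|_{L^2}^2 \;=\; \int |u(x,t)|^2 \, dx .

So the kinetic energy is non-increasing. Under the natural scaling u_\lambda(x,t) = \lambda\, u(\lambda x, \lambda^2 t) in three dimensions, the L^2 norm is not scale invariant while the L^3 norm is, which is why the abstract singles out the L^3 norm as the quantity whose decrease cannot be guaranteed.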


GigaScience, 2020, Vol 9 (12)
Author(s): Ariel Rokem, Kendrick Kay

Abstract

Background: Ridge regression is a regularization technique that penalizes the L2-norm of the coefficients in linear regression. One of the challenges of using ridge regression is the need to set a hyperparameter (α) that controls the amount of regularization. Cross-validation is typically used to select the best α from a set of candidates, but efficient and appropriate selection of α can be challenging and becomes prohibitive when large amounts of data are analyzed. Because the selected α depends on the scale of the data and on correlations across predictors, it is also not straightforwardly interpretable.

Results: The present work addresses these challenges through a novel approach to ridge regression. We propose to reparameterize ridge regression in terms of the ratio γ between the L2-norms of the regularized and unregularized coefficients. We provide an algorithm that efficiently implements this approach, called fractional ridge regression, as well as open-source software implementations in Python and MATLAB (https://github.com/nrdg/fracridge). We show that the proposed method is fast and scalable for large-scale data problems. In brain imaging data, we demonstrate that this approach delivers results that are straightforward to interpret and to compare across models and datasets.

Conclusion: Fractional ridge regression has several benefits: the solutions obtained for different γ are guaranteed to vary, guarding against wasted calculations, and they automatically span the relevant range of regularization, avoiding the need for arduous manual exploration. These properties make fractional ridge regression particularly suitable for the analysis of large, complex datasets.
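What follows is a minimal, self-contained sketch of the γ-reparameterization described in the abstract, run on synthetic data. It is not the algorithm used by the fracridge package, and the function and variable names are invented for illustration: ridge solutions are evaluated across a grid of α values via the SVD of the design matrix, γ = ‖β_ridge‖₂ / ‖β_OLS‖₂ is measured for each, and the α that produces a requested fraction is found by interpolation.

import numpy as np

def fractional_ridge_sketch(X, y, fracs):
    # Illustrative sketch of fractional ridge regression (not the
    # fracridge package's algorithm): for each requested fraction,
    # find ridge coefficients whose L2 norm is that fraction of the
    # norm of the unregularized (OLS) coefficients.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    uty = U.T @ y

    # Unregularized solution (alpha = 0) and its norm.
    coef_ols = Vt.T @ (uty / s)
    norm_ols = np.linalg.norm(coef_ols)

    # gamma(alpha) = ||coef(alpha)|| / ||coef_ols|| on a log-spaced alpha
    # grid; with the SVD in hand, each ridge solution is cheap to evaluate.
    alphas = np.logspace(-6, 6, 200)
    gammas = np.array([
        np.linalg.norm(Vt.T @ (uty * s / (s**2 + a))) for a in alphas
    ]) / norm_ols

    # gamma decreases as alpha increases, so reverse both arrays before
    # interpolating alpha as a function of the requested fraction.
    coefs = []
    for frac in fracs:
        a = np.interp(frac, gammas[::-1], alphas[::-1])
        coefs.append(Vt.T @ (uty * s / (s**2 + a)))
    return np.stack(coefs, axis=-1)

# Example on synthetic data: one coefficient vector per requested fraction.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
y = X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(100)
print(fractional_ridge_sketch(X, y, fracs=[0.1, 0.5, 0.9]).shape)  # (10, 3)

The implementations linked above are more efficient and handle edge cases; this sketch is only meant to make the γ-reparameterization concrete.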

