smooth classification
Recently Published Documents

TOTAL DOCUMENTS: 15 (FIVE YEARS: 2)
H-INDEX: 5 (FIVE YEARS: 0)

Author(s): D. Crowley, A. Skopenkov

We work in the smooth category. Let $N$ be a closed connected orientable 4-manifold with torsion-free $H_1$, where $H_q := H_q(N; {\mathbb Z})$. Our main result is a readily calculable classification of embeddings $N \to {\mathbb R}^7$ up to isotopy, with an indeterminacy. Such a classification was previously known only for $H_1 = 0$, by our earlier work from 2008. Our classification is complete when $H_2 = 0$ or when the signature of $N$ is divisible neither by 64 nor by 9. The group of knots $S^4 \to {\mathbb R}^7$ acts on the set of embeddings $N \to {\mathbb R}^7$ up to isotopy by embedded connected sum. In Part I we classified the quotient of this action. The main novelty of this paper is the description of this action for $H_1 \ne 0$, with an indeterminacy. Besides the invariants of Part I, detecting the action of knots involves a refinement of the Kreck invariant from our work of 2008. For $N = S^1 \times S^3$ we give a geometrically defined 1–1 correspondence between the set of isotopy classes of embeddings and a certain explicitly defined quotient of the set ${\mathbb Z} \oplus {\mathbb Z} \oplus {\mathbb Z}_{12}$.


2017, Vol 11 (1), pp. 263-312
Author(s): Ralf Spatzier, Lei Yang

2016, Vol 28 (6), pp. 1217-1247
Author(s): Yunlong Feng, Yuning Yang, Xiaolin Huang, Siamak Mehrkanoon, Johan A. K. Suykens

This letter addresses the problem of robustly learning a large margin classifier in the presence of label noise. We achieve this by proposing robustified large margin support vector machines. The robustness of the proposed robust support vector classifiers (RSVC), which is interpreted from a weighted viewpoint in this work, is due to the use of nonconvex classification losses. Besides being robust, the proposed RSVC is also smooth, which again benefits from using smooth classification losses. The idea behind RSVC comes from M-estimation in statistics, since the proposed robust and smooth classification losses can be taken as one-sided cost functions in robust statistics. Its Fisher consistency and generalization ability are also investigated. Besides robustness and smoothness, another nice property of RSVC is that its solution can be obtained by iteratively solving weighted squared hinge loss–based support vector machine problems. We further show that each iteration is a quadratic programming problem in its dual space and can be solved by state-of-the-art methods. We thus propose an iteratively reweighted algorithm and provide a constructive proof of its convergence to a stationary point. The effectiveness of the proposed classifiers is verified on both artificial and real data sets.
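The iteratively reweighted scheme this abstract describes can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the robust reweighting here uses a Welsch-type factor (an assumption on our part), and the inner weighted squared-hinge problem is solved by plain gradient descent rather than the dual quadratic program the paper uses. The names `rsvc_fit`, `lam`, and `sigma` are ours.

```python
import numpy as np

def rsvc_fit(X, y, lam=0.1, sigma=1.0, n_outer=10, n_inner=200, lr=0.01):
    """Illustrative iteratively reweighted classifier in the spirit of RSVC.

    Each outer step fits a *weighted* squared-hinge-loss SVM in the primal
    by gradient descent, then downweights points with large hinge residuals
    via a Welsch-type factor. The weighting and hyperparameters are
    illustrative assumptions, not the paper's exact construction.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    weights = np.ones(n)
    for _ in range(n_outer):
        # Inner loop: weighted squared-hinge SVM via gradient descent.
        for _ in range(n_inner):
            hinge = np.maximum(0.0, 1.0 - y * (X @ w + b))
            # Gradient of (lam/2)*||w||^2 + mean(weights * hinge^2 / 2).
            grad_w = lam * w - X.T @ (weights * y * hinge) / n
            grad_b = -np.sum(weights * y * hinge) / n
            w -= lr * grad_w
            b -= lr * grad_b
        # Reweight: points with large residuals (likely label noise)
        # receive exponentially small weight in the next round.
        hinge = np.maximum(0.0, 1.0 - y * (X @ w + b))
        weights = np.exp(-hinge**2 / (2.0 * sigma**2))
    return w, b

# Toy problem: two Gaussian blobs with 10% of the labels flipped.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.r_[-np.ones(50), np.ones(50)]
y_noisy = y.copy()
y_noisy[rng.choice(100, 10, replace=False)] *= -1.0
w, b = rsvc_fit(X, y_noisy)
acc = np.mean(np.sign(X @ w + b) == y)  # accuracy against the clean labels
```

In the paper each inner problem is a quadratic program solved in its dual; gradient descent stands in here only to keep the sketch self-contained.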


2013, Vol 23 (06), pp. 1350100
Author(s): ZHIHUA REN, JIAZHONG YANG, FUWANG YU

In this note, we study the smooth classification of all germs of 0-resonant diffeomorphisms on ℝ³ having generic nonlinear parts. We prove that, for diffeomorphisms of Poincaré type, with the exception of one germ, any two such germs are at least C³ conjugated if and only if their linear parts are similar; for the non-Poincaré ones, any two such germs are C^∞ conjugated provided that their linear parts are similar.


10.1167/5.9.1, 2005, Vol 5 (9), pp. 1
Author(s): Alan Chauvin, Keith J. Worsley, Philippe G. Schyns, Martin Arguin, Frédéric Gosselin
