Riemann’s example of a continuous, ‘nondifferentiable’ function

1978 ◽  
Vol 1 (1) ◽  
pp. 40-44 ◽  
Author(s):  
Erwin Neuenschwander
1970 ◽  
Vol 13 (1) ◽  
pp. 121-124 ◽  
Author(s):  
J. H. W. Burry ◽  
H. W. Ellis

In [1] it was shown that a continuous function f of bounded variation on the real line determines a Method II outer measure for which the Borel sets are measurable and the measure of an open interval equals the total variation of f over that interval. The monotone property of measures implies that if an open interval I on which f is not of bounded variation contains subintervals on which f is of finite but arbitrarily large total variation, then the measure of I is infinite. Since there are continuous functions that are not of bounded variation over any interval (e.g. the Weierstrass nondifferentiable function), the general case was not resolved.
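For concreteness, the standard definitions behind this construction (not quoted from [1]) are:

```latex
% Total variation of f over an open interval I = (a, b):
V_f(I) \;=\; \sup_{a < x_0 < x_1 < \dots < x_n < b}\;
  \sum_{i=1}^{n} \bigl| f(x_i) - f(x_{i-1}) \bigr| ,
% and the Method II construction extends the set function
% \mu\bigl((a,b)\bigr) = V_f\bigl((a,b)\bigr)
% to an outer measure for which the Borel sets are measurable.
```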


2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Abdellatif Moudafi ◽  
Paul-Emile Mainge

Based on a work by M. Dür and J.-B. Hiriart-Urruty [3], we consider the problem of deciding whether a symmetric matrix is copositive, formulated as a difference-of-convex-functions (d.c.) problem. Since the convex nondifferentiable function in this d.c. decomposition is proximable, we apply a proximal-gradient method to approximate the related stationary points, whereas in [3] the DCA algorithm was used.
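A hedged sketch of the underlying idea, not the authors' exact scheme: a symmetric matrix A is copositive iff the minimum of xᵀAx over the probability simplex is nonnegative, and if the nonsmooth convex term is taken to be the indicator of the simplex (whose proximal map is Euclidean projection), the proximal-gradient iteration reduces to projected gradient descent. The step size, iteration count, and test matrices below are illustrative assumptions.

```python
import numpy as np

def proj_simplex(v):
    """Euclidean projection onto the probability simplex
    (the prox of the simplex indicator function)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1.0)
    return np.maximum(v + theta, 0.0)

def min_quadratic_on_simplex(A, steps=500, t=0.05):
    """Proximal-gradient iteration (here: projected gradient) on
    x^T A x over the simplex; returns the objective at the final
    iterate.  A negative value certifies A is NOT copositive."""
    n = A.shape[0]
    x = np.full(n, 1.0 / n)          # start at the barycenter
    for _ in range(steps):
        grad = 2.0 * A @ x           # gradient of x^T A x (A symmetric)
        x = proj_simplex(x - t * grad)
    return float(x @ A @ x)

# Identity is copositive: minimum over the 2-simplex is 1/2 > 0.
print(min_quadratic_on_simplex(np.eye(2)))                       # 0.5
# [[0,-1],[-1,0]] is not copositive: minimum is -0.5 < 0.
print(min_quadratic_on_simplex(np.array([[0., -1.], [-1., 0.]])))
```

Only a stationary point of the nonconvex objective is reached in general, matching the abstract's aim of approximating stationary points rather than certifying global minima.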


2013 ◽  
Vol 23 (1) ◽  
pp. 59-71
Author(s):  
Nada Djuranovic-Milicic ◽  
Milanka Gardasevic-Filipovic

In this paper an algorithm for the minimization of a nondifferentiable function is presented. The algorithm uses the Moreau-Yosida regularization of the objective function and its second-order Dini upper directional derivative. The purpose of the paper is to establish general hypotheses under which the algorithm converges to optimal points. A convergence proof is given, as well as an estimate of the rate of convergence.
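As a hedged illustration of the regularization involved (the specific function and parameter are illustrative, not from the paper): the Moreau-Yosida regularization F_λ(x) = min_y { f(y) + ‖y − x‖²/(2λ) } smooths a nondifferentiable f while preserving its minimizers; for f = |·| the inner minimizer is soft-thresholding and the envelope is the Huber function.

```python
import math

def prox_abs(x, lam):
    """Proximal map of f(y) = |y|: soft-thresholding."""
    return math.copysign(max(abs(x) - lam, 0.0), x)

def moreau_abs(x, lam=1.0):
    """Moreau-Yosida regularization of |.|, evaluated via its prox.
    Equals the Huber function: x^2/(2*lam) for |x| <= lam,
    |x| - lam/2 otherwise -- differentiable although |.| is not."""
    p = prox_abs(x, lam)
    return abs(p) + (p - x) ** 2 / (2.0 * lam)

print(moreau_abs(2.0))   # 1.5   (= |2| - 1/2, linear region)
print(moreau_abs(0.5))   # 0.125 (= 0.5**2 / 2, quadratic region)
```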


2009 ◽  
Vol 19 (2) ◽  
pp. 249-262 ◽  
Author(s):  
Milanka Gardasevic-Filipovic

The minimization of a particular nondifferentiable function is considered. First- and second-order necessary conditions are given. A trust-region method for minimizing an objective function of this form is presented. The algorithm uses a subgradient instead of the gradient. It is proved that the sequence of points generated by the algorithm has an accumulation point satisfying the first- and second-order necessary conditions.
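A hedged toy sketch of the general idea (the test function, acceptance ratios, and radius updates are illustrative assumptions, not the paper's algorithm): replace the gradient in the trust-region model by a subgradient, minimize the resulting linear model on the trust-region ball, and accept or shrink via the usual reduction-ratio test.

```python
import numpy as np

def trust_region_subgradient(f, subgrad, x0, delta=1.0, iters=200):
    """Toy trust-region method with a subgradient g in the linear
    model m(d) = f(x) + g.d; the model minimizer over ||d|| <= delta
    is d = -delta * g / ||g||.  Standard accept/shrink ratio test."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = subgrad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-12:
            break                              # (near-)stationary
        d = -delta * g / gnorm
        predicted = delta * gnorm              # model decrease
        actual = f(x) - f(x + d)
        rho = actual / predicted
        if rho > 0.25:                         # accept the step
            x = x + d
            if rho > 0.75:
                delta *= 2.0                   # model is trustworthy
        else:
            delta *= 0.5                       # shrink the region
    return x

f = lambda x: np.abs(x).sum()                  # nondifferentiable at 0
subgrad = lambda x: np.sign(x)                 # a subgradient of ||.||_1
x = trust_region_subgradient(f, subgrad, [3.0, -4.0])
print(f(x))                                    # close to the minimum 0
```

The shrinking radius plays the role a line search cannot: near a kink the linear model overestimates the decrease, the ratio test rejects the step, and the radius contracts until progress resumes.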

