Steepest-Descent Approach to Triple Hierarchical Constrained Optimization Problems

2014 ◽  
Vol 2014 ◽  
pp. 1-19
Author(s):  
Lu-Chuan Ceng ◽  
Cheng-Wen Liao ◽  
Chin-Tzong Pang ◽  
Ching-Feng Wen

We introduce and analyze a hybrid steepest-descent algorithm by combining Korpelevich’s extragradient method, the steepest-descent method, and the averaged mapping approach to the gradient-projection algorithm. It is proven that under appropriate assumptions, the proposed algorithm converges strongly to the unique solution of a triple hierarchical constrained optimization problem (THCOP) over the common fixed point set of finitely many nonexpansive mappings, with constraints of finitely many generalized mixed equilibrium problems (GMEPs), finitely many variational inclusions, and a convex minimization problem (CMP) in a real Hilbert space.
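The averaged mapping approach to the gradient-projection algorithm that this abstract builds on can be illustrated in its simplest form. The sketch below is not the authors' hybrid algorithm; it only shows the averaged gradient-projection step x_{k+1} = (1 − θ)x_k + θ P_C(x_k − λ∇f(x_k)), with an illustrative quadratic objective and projection onto the unit ball as the constraint set C (both choices are assumptions for the example):

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the closed ball of given radius centred at 0."""
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else x * (radius / nrm)

def averaged_gradient_projection(grad, x0, step=0.1, theta=0.5, iters=500):
    """Averaged form of the gradient-projection algorithm:
    x_{k+1} = (1 - theta) * x_k + theta * P_C(x_k - step * grad(x_k))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = (1 - theta) * x + theta * project_ball(x - step * grad(x))
    return x

# Minimise f(x) = 0.5 * ||x - a||^2 over the unit ball;
# the minimiser is a / ||a|| whenever ||a|| > 1.
a = np.array([3.0, 4.0])
x_star = averaged_gradient_projection(lambda x: x - a, np.zeros(2))
```

The averaging parameter θ trades per-step progress for the nonexpansiveness that the convergence analysis of such schemes relies on.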

2015 ◽  
Vol 2015 ◽  
pp. 1-22
Author(s):  
L. C. Ceng ◽  
A. Latif ◽  
C. F. Wen ◽  
A. E. Al-Mazrooei

We introduce and analyze a relaxed iterative algorithm by combining Korpelevich’s extragradient method, the hybrid steepest-descent method, and Mann’s iteration method. We prove that, under appropriate assumptions, the proposed algorithm converges strongly to a common element of the fixed point set of infinitely many nonexpansive mappings, the solution set of finitely many generalized mixed equilibrium problems (GMEPs), the solution set of finitely many variational inclusions, and the solution set of a general system of variational inequalities (GSVI), which is precisely the unique solution of a triple hierarchical variational inequality (THVI) in a real Hilbert space. In addition, we also consider the application of the proposed algorithm to solving a hierarchical variational inequality problem with constraints of finitely many GMEPs, finitely many variational inclusions, and the GSVI. The results obtained in this paper improve and extend the corresponding results announced by many others.


Filomat ◽  
2020 ◽  
Vol 34 (5) ◽  
pp. 1557-1569
Author(s):  
Nguyen Buong ◽  
Nguyen Anh ◽  
Khuat Binh

In this paper, for finding a fixed point of a nonexpansive mapping in either uniformly smooth or reflexive and strictly convex Banach spaces with a uniformly Gâteaux differentiable norm, we present a new explicit iterative method based on a combination of the steepest-descent method with the Ishikawa iterative one. We also present several of its particular cases, one of which is the composite Halpern iterative method known in the literature. The explicit iterative method is also extended to the case of an infinite family of nonexpansive mappings. Numerical experiments are given for illustration.
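The Ishikawa building block combined with steepest descent above can be sketched in its basic form. This is not the paper's composite method; it is the plain Ishikawa iteration y_k = βx_k + (1 − β)Tx_k, x_{k+1} = αx_k + (1 − α)Ty_k, with the nonexpansive mapping T taken, purely for illustration, to be the projection onto the unit ball (so its fixed point set is the ball itself):

```python
import numpy as np

def T(x):
    """Projection onto the closed unit ball: a nonexpansive mapping whose
    fixed points are exactly the points of the ball."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def ishikawa(x0, alpha=0.5, beta=0.5, iters=100):
    """Ishikawa iteration for a nonexpansive mapping T:
    y_k = beta*x_k + (1-beta)*T(x_k),  x_{k+1} = alpha*x_k + (1-alpha)*T(y_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = beta * x + (1 - beta) * T(x)
        x = alpha * x + (1 - alpha) * T(y)
    return x

x_fix = ishikawa(np.array([3.0, 4.0]))  # approaches a fixed point of T
```

Starting outside the ball, the iterates move along the ray toward the boundary and stall at a fixed point of T, which is the behaviour the convergence theory predicts for averaged iterations of nonexpansive maps.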


2018 ◽  
Vol 7 (3.28) ◽  
pp. 72
Author(s):  
Siti Farhana Husin ◽  
Mustafa Mamat ◽  
Mohd Asrul Hery Ibrahim ◽  
Mohd Rivaie

In this paper, we develop a new search direction for the Steepest Descent (SD) method by replacing the previous search direction from the Conjugate Gradient (CG) method with the gradient from the previous step, for solving large-scale optimization problems. We also use one of the conjugate coefficients as a coefficient for the matrix. Under some reasonable assumptions, we prove that the proposed method with exact line search satisfies the descent property and is globally convergent. Further, numerical results on some unconstrained optimization problems show that the proposed algorithm is promising.
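For reference, the baseline the abstract modifies is classical steepest descent with exact line search, which on a quadratic f(x) = ½xᵀQx − bᵀx admits the closed-form step αₖ = gₖᵀgₖ / gₖᵀQgₖ. A minimal sketch of that baseline (the quadratic test problem is an assumption for the example, not from the paper):

```python
import numpy as np

def steepest_descent(Q, b, x0, iters=200):
    """Classical steepest descent for f(x) = 0.5*x^T Q x - b^T x, using the
    exact line search step alpha_k = (g_k^T g_k) / (g_k^T Q g_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = Q @ x - b                    # gradient of f; -g is the direction
        if np.linalg.norm(g) < 1e-12:
            break
        alpha = (g @ g) / (g @ (Q @ g))  # exact minimiser of f along -g
        x = x - alpha * g
    return x

Q = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x_min = steepest_descent(Q, b, np.zeros(2))  # approaches Q^{-1} b
```

The well-known zigzagging of this baseline on ill-conditioned problems is what motivates modified search directions like the one the abstract proposes.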


2014 ◽  
Vol 2014 ◽  
pp. 1-22
Author(s):  
Lu-Chuan Ceng ◽  
Cheng-Wen Liao ◽  
Chin-Tzong Pang ◽  
Ching-Feng Wen

We introduce and analyze a hybrid iterative algorithm by combining Korpelevich's extragradient method, the hybrid steepest-descent method, and the averaged mapping approach to the gradient-projection algorithm. It is proven that, under appropriate assumptions, the proposed algorithm converges strongly to a common element of the fixed point set of finitely many nonexpansive mappings, the solution set of a generalized mixed equilibrium problem (GMEP), the solution set of finitely many variational inclusions, and the solution set of a convex minimization problem (CMP), which is also a unique solution of a triple hierarchical variational inequality (THVI) in a real Hilbert space. In addition, we also consider the application of the proposed algorithm to solving a hierarchical variational inequality problem with constraints of the GMEP, the CMP, and finitely many variational inclusions.


Filomat ◽  
2019 ◽  
Vol 33 (14) ◽  
pp. 4403-4419
Author(s):  
Lu-Chuan Ceng ◽  
Jen-Chih Yao ◽  
Yonghong Yao

In this paper, we introduce and analyze a composite steepest-descent algorithm for solving the triple hierarchical variational inequality problem in a real Hilbert space. Under mild conditions, the strong convergence of the iteration sequences generated by the algorithm is established.


2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Nguyen Buong

In this paper, for solving the variational inequality problem over the set of common fixed points of a finite family of demiclosed quasi-nonexpansive mappings in Hilbert spaces, we propose two new strongly convergent methods, constructed by specific combinations of the steepest-descent method and block-iterative ones. Strong convergence is proved without the boundedly regular assumptions on the family of fixed point sets or the approximately shrinking property for each mapping of the family, which are usually assumed in the recent literature for similar problems. Applications to the multiple-operator split common fixed point problem (MOSCFPP) and to the problem of finding common minimum points of a finite family of lower semicontinuous convex functions are given, together with numerical experiments.
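The underlying mechanism, a steepest-descent step applied to the output of fixed-point mappings, can be sketched with Yamada-type hybrid steepest descent. This is an illustrative simplification, not the paper's two methods: T is a composition of two projections (so Fix(T) is the intersection of the two sets), F(x) = x − a is strongly monotone, and the iteration x_{n+1} = Tx_n − λ_n F(Tx_n) drives the iterates to the projection of a onto Fix(T):

```python
import numpy as np

def P_halfspace(x):
    """Projection onto C1 = {x : x[0] >= 0}."""
    return np.array([max(x[0], 0.0), x[1]])

def P_ball(x):
    """Projection onto C2 = the closed unit ball."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def hybrid_steepest_descent(a, x0, iters=5000):
    """Hybrid steepest-descent iteration x_{n+1} = T x_n - lam_n * F(T x_n)
    with F(x) = x - a; the limit solves the variational inequality over
    Fix(T) = C1 ∩ C2, i.e. it is the projection of a onto C1 ∩ C2."""
    x = np.asarray(x0, dtype=float)
    for n in range(iters):
        t = P_ball(P_halfspace(x))   # T = P_{C2} ∘ P_{C1}
        lam = 1.0 / (n + 2)          # lam_n → 0 with sum lam_n = ∞
        x = t - lam * (t - a)
    return P_ball(P_halfspace(x))    # report the feasible image T x

a = np.array([-2.0, 0.5])
x_vi = hybrid_steepest_descent(a, np.array([2.0, 2.0]))
```

Here the closest point of the half-ball C1 ∩ C2 to a = (−2, 0.5) is (0, 0.5), and the iterates approach it at the slow O(λ_n) rate typical of diminishing-step schemes.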


2021 ◽  
Vol 7 (1) ◽  
pp. 1-11
Author(s):  
Noureddine Rahali ◽  
Mohammed Belloufi ◽  
Rachid Benzine

An acceleration of the steepest descent method for solving unconstrained optimization problems is presented, which proposes a fundamentally different conjugate gradient method in which the well-known parameter βk is computed by a new formula. Under common assumptions, using a modified Wolfe line search, the descent property and global convergence results are established for the new method. Experimental results provide evidence that the proposed method is in general superior to the classical steepest descent method and has the potential to significantly enhance the computational efficiency and robustness of the training process.
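The abstract does not reproduce its new βk formula, so as an illustration of the conjugate gradient skeleton it plugs into, the sketch below uses the classical Fletcher–Reeves choice βk = ‖g_{k+1}‖² / ‖g_k‖² on a quadratic test problem (both the βk formula and the problem are stand-in assumptions, not the paper's method):

```python
import numpy as np

def cg_fletcher_reeves(Q, b, x0, iters=50):
    """Conjugate gradient skeleton on f(x) = 0.5*x^T Q x - b^T x, with the
    classical Fletcher-Reeves beta_k = ||g_{k+1}||^2 / ||g_k||^2 standing in
    for the paper's (unspecified) new formula."""
    x = np.asarray(x0, dtype=float)
    g = Q @ x - b
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-12:
            break
        alpha = -(g @ d) / (d @ (Q @ d))   # exact line search along d
        x = x + alpha * d
        g_new = Q @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # new conjugate direction
        g = g_new
    return x

Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_cg = cg_fletcher_reeves(Q, b, np.zeros(2))  # exact in n steps for n x n Q
```

On a quadratic with exact line search this reduces to linear CG and terminates in at most n steps; modified βk formulas like the paper's aim to retain descent and global convergence on general nonlinear objectives with inexact (e.g. Wolfe) line searches.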

