On Some Extremal Problems in the Theory of Differential Equations with Applications to the Theory of Optimal Control

Author(s):  
R. V. Gamkrelidze
The Himalayan Physics ◽  
2011 ◽  
Vol 2 (2) ◽  
pp. 50-53
Author(s):  
R. N. Yadav ◽  
S. K. Chakrabarti

Higher-dimensional differential equations can model several real-world processes that depend on their pre-history and are subject to short-time disturbances. Such processes occur in the theory of optimal control, population dynamics, biotechnology, economics, mathematical physics, etc., so the study of this class of dynamical systems is gradually gaining momentum. In the present work, the Avery-Peterson theorem is employed to obtain positive periodic solutions of the corresponding differential equations in cones, with impulses on time scales. Using multiple fixed-point theorems, we show through several lemmas and the manipulation of several functions how the necessary criteria can be derived so that the results are both feasible and effective.

Keywords: Real-world simulation processes; Theory of optimal control; Population dynamics; Biotechnologies; Economics

The Himalayan Physics Vol.2, No.2, May, 2011, Page: 50-53. Uploaded Date: 1 August, 2011
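The Avery-Peterson theorem is non-constructive, but the basic idea it formalizes, locating a fixed point of a positivity-preserving operator, can be illustrated numerically. The map and starting point below are my own illustration, not from the paper, and the simple iteration shown is only a loose computational analogue of the cone fixed-point setting:

```python
# Simple fixed-point iteration on the positive half-line: a loose
# computational analogue of locating a positive fixed point of an
# operator, as cone fixed-point theorems (Avery-Peterson among them)
# guarantee in far more general settings. Illustrative only.
import math

def T(x):
    # A map that preserves positivity: T(x) = sqrt(x + 1).
    return math.sqrt(x + 1.0)

x = 1.0  # start inside the cone (positive reals)
for _ in range(60):
    x = T(x)

# The fixed point solves x**2 = x + 1, i.e. the golden ratio.
phi = (1.0 + math.sqrt(5.0)) / 2.0
print(abs(x - phi) < 1e-12)  # prints True: converged to the fixed point
```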


Games ◽  
2021 ◽  
Vol 12 (1) ◽  
pp. 23
Author(s):  
Alexander Arguchintsev ◽  
Vasilisa Poplevko

This paper deals with an optimal control problem for a linear system of first-order hyperbolic equations with a function on the right-hand side determined from controlled bilinear ordinary differential equations. These ordinary differential equations are linear with respect to the state functions, with controlled coefficients. Such problems arise in the simulation of some processes of chemical technology and population dynamics. Normally, general optimal control methods are used for these problems because of the bilinear ordinary differential equations. In this paper, the problem is reduced to an optimal control problem for a system of ordinary differential equations. The reduction is based on non-classical exact increment formulas for the cost functional. This treatment makes it possible to apply a number of efficient optimal control methods to the problem. An example illustrates the approach.
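The bilinear structure described above, state equations linear in the state with the control entering through the coefficients, can be sketched with a forward-Euler integration of x'(t) = (A + u(t)·B)x(t). All numerical data below (the matrices, the control, the step size) is my own illustration, not taken from the paper:

```python
# Forward-Euler integration of a bilinear ODE  x'(t) = (A + u(t)*B) x(t):
# linear in the state x, with the control u entering the coefficients.
# All numerical data here is illustrative, not taken from the paper.

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def euler_bilinear(A, B, u, x0, t_end, steps):
    h = t_end / steps
    x = list(x0)
    for k in range(steps):
        t = k * h
        # Effective system matrix A + u(t) * B at this time step.
        M = [[A[i][j] + u(t) * B[i][j] for j in range(len(x))]
             for i in range(len(x))]
        dx = mat_vec(M, x)
        x = [x[i] + h * dx[i] for i in range(len(x))]
    return x

A = [[0.0, 1.0], [-1.0, 0.0]]   # drift: harmonic oscillator
B = [[-1.0, 0.0], [0.0, -1.0]]  # control channel: uniform damping
u = lambda t: 0.5               # constant control (illustrative)
x = euler_bilinear(A, B, u, [1.0, 0.0], t_end=1.0, steps=1000)
print(x)  # state decays: with u = 0.5 the norm shrinks by about exp(-0.5)
```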


Author(s):  
Mohammad A. Kazemi

Abstract: In this paper a class of optimal control problems with distributed parameters is considered. The governing equations are nonlinear first-order partial differential equations that arise in the study of heterogeneous reactors and control of chemical processes. The main focus of the present paper is the mathematical theory underlying the algorithm. A conditional gradient method is used to devise an algorithm for solving such optimal control problems. A formula for the Fréchet derivative of the objective function is obtained, and its properties are studied. A necessary condition for optimality in terms of the Fréchet derivative is presented, and then it is shown that any accumulation point of the sequence of admissible controls generated by the algorithm satisfies this necessary condition for optimality.
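The conditional gradient (Frank-Wolfe) scheme described here, linearize the objective via its derivative, minimize that linearization over the admissible set, and step toward the minimizer, can be sketched in a finite-dimensional analogue. The quadratic objective and box constraint below are my own illustration, not the paper's control problem; the gradient plays the role the Fréchet derivative plays in the infinite-dimensional setting:

```python
# Conditional gradient (Frank-Wolfe) method on a finite-dimensional
# analogue: minimize f(u) = 0.5 * ||u - target||^2 over the box
# [-1, 1]^n. The objective and constraint set are illustrative only.

def grad(u, target):
    # Gradient of f at u (stand-in for the Frechet derivative).
    return [ui - ti for ui, ti in zip(u, target)]

def linear_minimizer(g):
    # Minimize <g, v> over the box [-1, 1]^n: each coordinate of the
    # minimizing vertex is opposite in sign to the gradient component.
    return [-1.0 if gi > 0 else 1.0 for gi in g]

def frank_wolfe(target, n_iter=200):
    u = [0.0] * len(target)
    for k in range(n_iter):
        g = grad(u, target)
        v = linear_minimizer(g)          # solve the linearized subproblem
        step = 2.0 / (k + 2)             # classical diminishing step size
        u = [(1 - step) * ui + step * vi for ui, vi in zip(u, v)]
    return u

target = [0.3, -0.8, 0.5]  # lies inside the box, so it is the optimum
u = frank_wolfe(target)
print(u)  # close to target after 200 iterations
```

Because every iterate is a convex combination of vertices of the admissible set, the method never leaves it, which is exactly why the scheme suits constrained control problems.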

