General necessary conditions for optimal control of stochastic systems

Author(s):  
U. G. Haussmann
2012 ◽  
Vol 2012 ◽  
pp. 1-50 ◽  
Author(s):  
Jingtao Shi

This paper deals with the general optimal control problem for fully coupled forward-backward stochastic differential equations with random jumps (FBSDEJs). The control domain is not assumed to be convex, and the control variable appears in both the diffusion and jump coefficients of the forward equation. Necessary conditions of Pontryagin's type for the optimal controls are derived by means of the spike variation technique and Ekeland's variational principle. A linear quadratic stochastic optimal control problem is discussed as an illustrative example.
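As a point of reference for the linear quadratic example mentioned above, here is a minimal sketch of a *discrete-time* stochastic LQ problem with additive noise, where certainty equivalence makes the optimal feedback identical to the deterministic LQ gain from the backward Riccati recursion. All matrices and horizons are illustrative choices of ours, not taken from the paper (which treats the far more general fully coupled FBSDEJ setting).

```python
import numpy as np

# Illustrative stochastic LQ problem (our own toy instance, not the paper's):
#   x_{k+1} = A x_k + B u_k + w_k,   cost  E[ sum_k x'Qx + u'Ru + x_N' Qf x_N ].
# With additive noise, the optimal control is the deterministic LQ feedback
# u_k = -K_k x_k (certainty equivalence); K_k comes from the Riccati recursion.

def lq_gains(A, B, Q, R, Qf, N):
    """Backward Riccati recursion; returns feedback gains K_0, ..., K_{N-1}."""
    P = Qf
    gains = []
    for _ in range(N):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]

# A marginally unstable double-integrator-like system, step size 0.1.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
Q, R, Qf = np.eye(2), np.array([[1.0]]), 10.0 * np.eye(2)
N = 50
K = lq_gains(A, B, Q, R, Qf, N)

# Simulate the closed loop under small additive Gaussian noise.
rng = np.random.default_rng(0)
x = np.array([1.0, 0.0])
for k in range(N):
    u = -K[k] @ x
    x = A @ x + B @ u + 0.01 * rng.standard_normal(2)
```

The noisy closed-loop state is driven toward the origin; only the noise floor remains, which is the qualitative behavior the LQ example in the paper is meant to exhibit.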


1968 ◽  
Vol 8 (1) ◽  
pp. 114-118 ◽  
Author(s):  
A. W. J. Stoddart

In [4], Hanson has obtained necessary conditions and sufficient conditions for optimality of a program in stochastic systems. However, in many cases, especially in a general treatment, a program satisfying these conditions cannot be determined explicitly, so that the question of existence of an optimal program in such systems is significant. In this paper, we obtain conditions sufficient for existence of an optimal program by applying the direct methods of the calculus of variations [9], [6] and the theory of optimal control [7], [5].
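To make the "direct method" mentioned above concrete: one discretizes the functional and minimizes over a finite-dimensional space of grid values, where existence of a minimizer is immediate. The toy functional below, J[x] = ∫₀¹ (x′² + x²) dt with x(0)=0, x(1)=1, is our own illustrative choice, not one from the paper; since the discretized J is a convex quadratic, its minimizer is found by solving a linear system.

```python
import numpy as np

# Direct method on a toy convex functional (illustrative only):
# minimize J[x] = ∫_0^1 (x'^2 + x^2) dt over x(0)=0, x(1)=1.
# Discretizing J over grid values x_1..x_n and setting the gradient to zero
# yields a tridiagonal linear system; the continuous minimizer is
# x(t) = sinh(t) / sinh(1).
n = 200                          # interior grid points
h = 1.0 / (n + 1)
t = np.linspace(h, 1 - h, n)

main = (2.0 / h**2 + 1.0) * np.ones(n)
off = (-1.0 / h**2) * np.ones(n - 1)
Amat = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
b = np.zeros(n)
b[-1] = 1.0 / h**2               # carries the boundary value x(1) = 1

x = np.linalg.solve(Amat, b)     # discrete minimizer
exact = np.sinh(t) / np.sinh(1.0)
err = np.max(np.abs(x - exact))
```

The discrete minimizer converges to the continuous one at rate O(h²), which is the practical payoff of the existence theory: once a minimizer is known to exist, such discretizations are guaranteed to be approximating something.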


1974 ◽  
Vol 19 (6) ◽  
pp. 1165-1175 ◽  
Author(s):  
EDGAR C. TACKER ◽  
THOMAS D. LINTON ◽  
CHARLES W. SANDERS

2015 ◽  
Vol 2015 ◽  
pp. 1-7
Author(s):  
Rui Zhang ◽  
Yinjing Guo ◽  
Xiangrong Wang ◽  
Xueqing Zhang

This paper extends stochastic stability criteria in terms of two measures to mean stability and proves stability criteria for a class of stochastic Itô systems. Moreover, by applying optimal control approaches, mean stability criteria in terms of two measures are also obtained for stochastic systems with coefficient uncertainty.
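Mean stability of an Itô system can be checked empirically by Monte Carlo simulation. As a hedged sketch (the scalar SDE, parameters, and Euler–Maruyama scheme below are our own illustrative choices, not the paper's systems): for dX = −aX dt + σX dW, the mean obeys dE[X]/dt = −aE[X], so E[X_t] = x₀e^{−at} decays for a > 0 regardless of σ.

```python
import numpy as np

# Illustrative Monte Carlo check of mean stability (not the paper's systems):
# for dX = -a X dt + sigma X dW, the mean satisfies dE[X]/dt = -a E[X],
# so E[X_T] = x0 * exp(-a T). We verify this with Euler-Maruyama paths.

def sample_mean(a, sigma, x0, T, dt, n_paths, seed=0):
    """Euler-Maruyama simulation of n_paths; returns the sample mean of X_T."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0)
    for _ in range(int(T / dt)):
        dw = rng.standard_normal(n_paths) * np.sqrt(dt)
        x = x + (-a * x) * dt + sigma * x * dw
    return x.mean()

m = sample_mean(a=1.0, sigma=0.5, x0=1.0, T=2.0, dt=0.001, n_paths=20000)
# The sample mean should be close to exp(-a*T) = exp(-2).
```

Note that mean stability is a weak notion: for large σ the *second* moment of this same SDE can grow even while the mean decays, which is one reason criteria in terms of two measures are useful.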

