A New Approach to Stable Optimal Control of Complex Nonlinear Dynamical Systems
This paper presents a simple approach to designing a controller that minimizes a user-specified control cost for a mechanical system while guaranteeing that the control is stable. For a user-provided Lyapunov function, the method ensures that its time rate of change is negative and equals a user-specified negative-definite function. A closed-form, optimal, nonlinear controller is thus obtained that minimizes the desired control cost at each instant of time and is guaranteed to be Lyapunov stable. The complete nonlinear dynamical system is handled with no approximations or linearizations, and no a priori structure is imposed on the controller. The methodology is developed here for systems modeled by second-order, nonautonomous, nonlinear differential equations. The approach relies on recent fundamental results in analytical dynamics and uses ideas from the theory of constrained motion.
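The core idea can be sketched on a toy example: a unit-mass pendulum with dynamics q̈ = −sin q + u and the energy-like Lyapunov function V = ½q̇² + (1 − cos q). Demanding that V̇ equal a chosen decay rate turns the Lyapunov condition into a linear constraint on the acceleration, which a Moore–Penrose pseudoinverse solves for the minimum-norm control, in the spirit of the Udwadia–Kalaba equation of constrained motion. Everything below is an illustrative assumption, not the paper's own code: the system, the choice of V, and the rate function −kq̇² (which is only negative semidefinite in the full state, whereas the paper works with a negative-definite rate) are picked for simplicity.

```python
import numpy as np

def lyapunov(q, qd):
    """Energy-like Lyapunov function for the unit pendulum (assumed choice)."""
    return 0.5 * qd**2 + (1.0 - np.cos(q))

def control(q, qd, k=2.0):
    """Closed-form control enforcing dV/dt = -k*qd^2 at this instant."""
    a = -np.sin(q)                    # uncontrolled acceleration, unit mass
    # dV/dt = qd*qdd + sin(q)*qd, so the Lyapunov condition is the
    # linear acceleration constraint A*qdd = b with:
    A = np.array([[qd]])
    b = np.array([-k * qd**2 - np.sin(q) * qd])
    # Minimum-norm u satisfying A @ (a + u) = b; the pseudoinverse
    # returns u = 0 on the singular set qd = 0.
    u = np.linalg.pinv(A) @ (b - A @ np.array([a]))
    return float(u[0])

def simulate(q0, qd0, dt=1e-3, steps=10000):
    """Explicit-Euler rollout of the controlled pendulum."""
    q, qd = q0, qd0
    for _ in range(steps):
        qdd = -np.sin(q) + control(q, qd)
        q, qd = q + dt * qd, qd + dt * qdd
    return q, qd

q, qd = simulate(1.0, 0.0)
print(lyapunov(q, qd))  # far below the initial value lyapunov(1.0, 0.0)
```

For this particular choice of V and decay rate, the pseudoinverse formula collapses to simple viscous damping u = −kq̇, which is a useful sanity check that the constrained-motion machinery reproduces a familiar stabilizing controller in a simple case.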