This is a concise introduction to stochastic optimal control theory. We assume that readers have a basic knowledge of real analysis, functional analysis, elementary probability, ordinary differential equations, and partial differential equations. We present the following topics: (i) a brief presentation of relevant results from stochastic analysis; (ii) formulation of stochastic optimal control problems; (iii) the variational method and Pontryagin's maximum principle, together with a brief introduction to backward stochastic differential equations; (iv) the dynamic programming method and viscosity solutions to the Hamilton-Jacobi-Bellman equation; (v) linear-quadratic optimal control problems, including a careful discussion of open-loop optimal controls and closed-loop optimal strategies, linear forward-backward stochastic differential equations, and Riccati equations.
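For orientation, the problem formulation in (ii) and the Hamilton-Jacobi-Bellman equation in (iv) can be sketched in a standard generic form (the notation below is illustrative and not necessarily the symbols used in the paper itself):

```latex
% Controlled state equation: a stochastic differential equation driven by a
% Brownian motion W, with control process u(.) taking values in a set U.
\[
  \mathrm{d}X(t) = b\bigl(t, X(t), u(t)\bigr)\,\mathrm{d}t
                 + \sigma\bigl(t, X(t), u(t)\bigr)\,\mathrm{d}W(t),
  \qquad X(0) = x.
\]
% Cost functional to be minimized over admissible controls u(.):
\[
  J\bigl(u(\cdot)\bigr)
    = \mathbb{E}\!\left[\int_0^T g\bigl(t, X(t), u(t)\bigr)\,\mathrm{d}t
      + h\bigl(X(T)\bigr)\right].
\]
% The value function V(t, x) formally satisfies the Hamilton-Jacobi-Bellman
% equation, whose viscosity solutions are the subject of topic (iv):
\[
  \partial_t V
  + \inf_{u \in U}\Bigl\{ b(t,x,u)^{\top}\nabla_x V
      + \tfrac{1}{2}\operatorname{tr}\bigl(\sigma\sigma^{\top}(t,x,u)\,\nabla_x^2 V\bigr)
      + g(t,x,u) \Bigr\} = 0,
  \qquad V(T, x) = h(x).
\]
```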
Yong, J. (2022). Stochastic optimal control — A concise introduction. Mathematical Control and Related Fields, 12(4), 1039–1136. https://doi.org/10.3934/mcrf.2020027