STOCHASTIC OPTIMAL CONTROL — A CONCISE INTRODUCTION

9 citations · 17 Mendeley readers

Abstract

This is a concise introduction to stochastic optimal control theory. We assume that readers have basic knowledge of real analysis, functional analysis, elementary probability, ordinary differential equations, and partial differential equations. We present the following topics: (i) a brief presentation of relevant results from stochastic analysis; (ii) the formulation of stochastic optimal control problems; (iii) the variational method and Pontryagin's maximum principle, together with a brief introduction to backward stochastic differential equations; (iv) the dynamic programming method and viscosity solutions to the Hamilton-Jacobi-Bellman equation; (v) linear-quadratic optimal control problems, including a careful discussion of open-loop optimal controls and closed-loop optimal strategies, linear forward-backward stochastic differential equations, and Riccati equations.
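To fix ideas, the standard formulation underlying topics (ii) and (iv) can be sketched as follows; this is the canonical setup of the field (the notation here is generic, not taken from the article itself):

```latex
% Controlled stochastic differential equation (state X, control u, Brownian motion W):
%   dX(s) = b(s, X(s), u(s))\,ds + \sigma(s, X(s), u(s))\,dW(s), \qquad X(t) = x.
%
% Cost functional to be minimized over admissible controls u(\cdot):
%   J(t, x; u(\cdot)) = \mathbb{E}\!\left[ \int_t^T f(s, X(s), u(s))\,ds + g(X(T)) \right].
%
% Value function:
%   V(t, x) = \inf_{u(\cdot)} J(t, x; u(\cdot)).
%
% Dynamic programming leads (formally) to the Hamilton-Jacobi-Bellman equation:
%   V_t(t,x) + \inf_{u \in U} \Big\{ b(t,x,u)^{\top} \nabla_x V(t,x)
%       + \tfrac{1}{2} \mathrm{tr}\big( \sigma \sigma^{\top}(t,x,u)\, D_x^2 V(t,x) \big)
%       + f(t,x,u) \Big\} = 0, \qquad V(T, x) = g(x).
```

Since $V$ is generally not smooth, the HJB equation is interpreted in the viscosity sense, which is the subject of topic (iv) above.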

Citation (APA)

Yong, J. (2022). STOCHASTIC OPTIMAL CONTROL — A CONCISE INTRODUCTION. Mathematical Control and Related Fields, 12(4), 1039–1136. https://doi.org/10.3934/mcrf.2020027
