In this paper we formulate a time-optimal control problem in the space of probability measures endowed with the Wasserstein metric, as a natural generalization of the corresponding classical problem in R^d in which the controlled dynamics is given by a differential inclusion. The main motivation is to model situations in which we have only a probabilistic knowledge of the initial state. In particular, we first prove a Dynamic Programming Principle and then derive a Hamilton-Jacobi-Bellman equation in the space of probability measures, which is solved in a suitable viscosity sense by a generalization of the minimum time function.
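For orientation, the classical finite-dimensional problem being generalized can be sketched as follows (a sketch only; the symbols F for the set-valued dynamics and S for the target set are illustrative and not necessarily the paper's own notation):

```latex
% Minimum time to steer x to the target S along trajectories of the
% differential inclusion \dot{\gamma} \in F(\gamma):
\[
  T(x) \;=\; \inf\bigl\{\, t \ge 0 \;:\; \exists\,\gamma,\
  \dot{\gamma}(s) \in F(\gamma(s)),\ \gamma(0) = x,\ \gamma(t) \in S \,\bigr\}.
\]
% Under standard assumptions, T solves in the viscosity sense the
% stationary Hamilton-Jacobi-Bellman equation
\[
  \sup_{v \in F(x)} \bigl\langle -v,\ \nabla T(x) \bigr\rangle = 1
  \quad \text{for } x \notin S, \qquad T = 0 \ \text{on } S.
\]
```

The paper lifts this setting from R^d to the space of probability measures with the Wasserstein metric, replacing the point-valued initial state x with a probability measure.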
Cavagnari, G., Marigonda, A., & Orlandi, G. (2016). Hamilton-Jacobi-Bellman equation for a time-optimal control problem in the space of probability measures. In IFIP Advances in Information and Communication Technology (Vol. 494, pp. 200–208). Springer New York LLC. https://doi.org/10.1007/978-3-319-55795-3_18