Variational methods in problems of control and programming


It is shown how a fairly general control problem, or programming problem, with constraints can be reduced to a special type of classical Bolza problem in the calculus of variations. Necessary conditions from the Bolza problem are translated into necessary conditions for optimal control. It is seen from these conditions that Pontryagin's Maximum Principle is a translation of the usual Weierstrass condition, and is applicable to a wider class of problems than that considered by Pontryagin. The differentiability and continuity properties of the value of the control problem are established under reasonable hypotheses on the synthesis, and it is shown that the value satisfies the Hamilton-Jacobi equation. As a corollary we obtain a rigorous proof of a functional equation of Bellman that is valid for a much wider class of problems than heretofore. A sufficiency theorem for the synthesis of control is also given. © 1961.
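For orientation, the Hamilton-Jacobi equation for the value mentioned above can be sketched in modern notation (the symbols below are illustrative assumptions, not taken from the paper): writing $W(t,x)$ for the value of the control problem started at time $t$ in state $x$, with dynamics $\dot x = f(t,x,u)$, running cost $L(t,x,u)$, terminal cost $g(x)$, and admissible controls $u \in U$, the equation reads

$$
\frac{\partial W}{\partial t}(t,x) \;+\; \min_{u \in U}\Bigl[\nabla_x W(t,x)\cdot f(t,x,u) \;+\; L(t,x,u)\Bigr] \;=\; 0,
\qquad W(T,x) = g(x).
$$

When $W$ is sufficiently smooth, this is the functional equation of Bellman (dynamic programming) referred to in the abstract; the minimization over $u$ corresponds to the Maximum Principle / Weierstrass condition along an optimal synthesis.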




Berkovitz, L. D. (1961). Variational methods in problems of control and programming. Journal of Mathematical Analysis and Applications, 3(1), 145–169.
