A Tutorial on Optimal Control Theory
INFOR, Vol. 19, No. 4, p. 279-291, November 1981
14 pages. Posted: 24 Jun 2009; last revised: 22 Mar 2014
Date Written: November 1, 1981
Management science applications frequently involve problems of controlling continuous-time dynamic systems, that is, systems which evolve over time. Optimal control theory, a relatively new branch of mathematics, determines the optimal way to control such a dynamic system. The purpose of this tutorial paper is to provide an elementary introduction to optimal control theory and to illustrate it by formulating a simple example. A reader who has worked through this tutorial should be able to read most of this special issue of INFOR, which contains articles applying optimal control theory to the solution of management science problems.
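To make the idea of controlling a continuous-time dynamic system concrete, here is a minimal numerical sketch (an illustrative example of my own, not the example formulated in the paper). It treats the standard scalar linear-quadratic problem: minimize J = ∫₀ᵀ (x² + u²) dt subject to ẋ = u, x(0) = 1. For this problem the optimal control is the feedback law u*(t) = −p(t)x(t), where p satisfies the Riccati equation −ṗ = 1 − p² with p(T) = 0, whose closed-form solution is p(t) = tanh(T − t); the optimal cost is p(0)·x(0)².

```python
import math

# Horizon and discretization (assumed values for illustration).
T = 2.0
N = 20000
dt = T / N

# Integrate the Riccati equation backward in time with an Euler scheme:
#   -dp/dt = 1 - p^2,  p(T) = 0.
p = [0.0] * (N + 1)
for k in range(N, 0, -1):
    p[k - 1] = p[k] + dt * (1.0 - p[k] ** 2)

# Simulate the closed-loop system forward under the optimal feedback
# u = -p(t) x(t), accumulating the cost J = ∫ (x^2 + u^2) dt.
x, J = 1.0, 0.0
for k in range(N):
    u = -p[k] * x
    J += (x * x + u * u) * dt
    x += dt * u          # dx/dt = u

# p(0) should match the closed-form value tanh(T), and the accumulated
# cost J should approach the optimal cost p(0) * x(0)^2 = tanh(T).
print(p[0], math.tanh(T), J)
```

The backward pass computes the costate (Riccati) solution that the maximum principle would also produce for this problem, and the forward pass shows the resulting optimal trajectory; with this step size the numerical p(0) and cost agree with tanh(T) to about three decimal places.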
Keywords: Optimal control, dynamic systems, Maximum principle, economic interpretation
JEL Classification: C61