On Time Inconsistent Stochastic Control in Continuous Time
32 Pages. Posted: 27 Dec 2016
Date Written: December 26, 2016
Abstract
In this paper, which is a continuation of a previously published discrete time paper, we study a class of continuous time stochastic control problems which, in various ways, are time inconsistent in the sense that they do not admit a Bellman optimality principle. We study these problems within a game theoretic framework, and we look for Nash subgame perfect equilibrium points. For a general controlled continuous time Markov process and a fairly general objective functional, we derive an extension of the standard Hamilton-Jacobi-Bellman equation, in the form of a system of non-linear equations, for the determination of the equilibrium strategy as well as the equilibrium value function. The main theoretical result is a verification theorem. As an application of the general theory, we study a time inconsistent linear quadratic regulator. We also present a study of time inconsistency within the framework of a general equilibrium production economy of Cox-Ingersoll-Ross type.
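To make the objects in the abstract concrete, the following LaTeX sketch records (i) the local notion of equilibrium used in this strand of the literature and (ii) a schematic form of the extended HJB system for the simplest case of a reward E_{t,x}[F(x, X_T)], whose dependence on the initial state x is the source of time inconsistency. The symbols A^u, f^y, u_h and the exact layout of the system are notation assumed here for illustration, roughly following the conventions of this literature; the paper's actual system is more general (running rewards and a nonlinear function of the expected terminal state) and is stated, together with the verification theorem, in the body of the paper.

% Sketch only; amsmath assumed (for the cases environment and \text).
% (i) Equilibrium via spike variation: perturb a candidate feedback law \hat{u}
%     on [t, t+h) by an arbitrary admissible u, and keep \hat{u} afterwards.
\[
u_h(s,y) =
\begin{cases}
u(s,y), & t \le s < t+h, \\
\hat{u}(s,y), & t+h \le s \le T,
\end{cases}
\qquad
\liminf_{h \downarrow 0} \frac{J(t,x,\hat{u}) - J(t,x,u_h)}{h} \ge 0
\quad \text{for all admissible } u \text{ and all } (t,x).
\]
% (ii) Schematic extended HJB system for J(t,x,u) = E_{t,x}[F(x, X_T^u)],
%      with \mathcal{A}^u the controlled infinitesimal generator,
%      f^y(t,x) = E_{t,x}[F(y, X_T^{\hat{u}})] (y frozen as a parameter),
%      and \bar{f}(t,x) = f(t,x,x):
\[
\sup_{u}\Big\{(\mathcal{A}^{u}V)(t,x) - (\mathcal{A}^{u}\bar{f})(t,x)
  + (\mathcal{A}^{u}f^{x})(t,x)\Big\} = 0,
\qquad V(T,x) = F(x,x),
\]
\[
\mathcal{A}^{\hat{u}} f^{y}(t,x) = 0, \qquad f^{y}(T,x) = F(y,x),
\qquad V(t,x) = f(t,x,x),
\]
% where \hat{u}(t,x) attains the supremum. In this schematic form, the verification
% theorem asserts that a sufficiently regular solution (V, f) of the system yields
% the equilibrium value function V and the equilibrium strategy \hat{u}.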
Keywords: Time consistency, time inconsistency, time inconsistent control, dynamic programming
JEL Classification: C61, C72, C73, D5, G11, G12