Maximum Entropy and Information Theory Approaches to Economics
22 Pages. Posted: 4 Jan 2018. Last revised: 24 Jan 2018.
Date Written: December 30, 2017
Abstract
In the natural sciences, complex non-linear systems composed of large numbers of smaller subunits provide an opportunity to apply the tools of statistical mechanics and information theory. The principle of maximum entropy can often provide shortcuts in the treatment of these complex systems. However, there is an obstacle to straightforward application to social and economic systems: the lack of well-defined constraints from which to derive Lagrange multipliers. In economics, this is typically addressed by introducing marginal utility as a Lagrange multiplier.
Jumping off from Gary Becker's 1962 paper "Irrational Behavior and Economic Theory" — a maximum entropy argument in disguise — we introduce Peter Fielitz and Guenter Borchardt's concept of "information equilibrium," presented in arXiv:0905.0610v4 [physics.gen-ph], as a means of applying maximum entropy methods even in cases, such as economics, where well-defined constraints (like the energy conservation required to define Lagrange multipliers and partition functions) are not obvious. From these initial steps we motivate a well-defined constraint in terms of growth rates and develop a formalism for ensembles of markets described by information equilibrium conditions. We apply information equilibrium to a description of the US unemployment rate and connect it to search and matching theory and to empirical regularities such as Okun's Law. This represents a step toward Lee Smolin's call in arXiv:0902.4274 [q-fin.GN] for a "statistical economics" analogous to statistical mechanics.
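For readers unfamiliar with the referenced work, the central relationship of Fielitz and Borchardt's information equilibrium framework (as commonly stated in arXiv:0905.0610) can be sketched as follows; the notation here is illustrative, not taken from this abstract:

```latex
% Information equilibrium condition between process variables A and B,
% with information transfer index k:
\frac{dA}{dB} = k \, \frac{A}{B}
% Its general solution is a power law relating the two variables:
A = A_{\mathrm{ref}} \left( \frac{B}{B_{\mathrm{ref}}} \right)^{k}
```

In growth-rate terms, this condition implies the growth rate of A is k times the growth rate of B, which is the kind of constraint the abstract refers to when motivating ensembles of markets.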
Keywords: Information theory, macroeconomics, microeconomics
JEL Classification: C00, E10, E30, E40