Some Decision Theoretic Generalizations of Information Measures
32 Pages; Posted: 1 Nov 2005
Date Written: December 6, 2005
Abstract
We review a decision-theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler relative entropy, the natural generalizations that follow, and various properties of these generalized quantities. We then consider these generalized quantities in an easily interpreted special case. We show that the resulting quantities share many of the properties of entropy and relative entropy, such as the data processing inequality and the second law of thermodynamics. We formulate an important statistical learning problem, probability estimation, in terms of a generalized relative entropy. The solution of this problem reflects general risk preferences via the utility function; moreover, the solution is optimal in the sense of robust absolute performance.
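As a concrete point of reference for the quantities the abstract generalizes, the following sketch computes standard Shannon entropy, Kullback-Leibler relative entropy, and Tsallis entropy (one well-known generalized entropy mentioned in the keywords) for a discrete distribution. This illustrates only the textbook definitions, not the paper's utility-based generalization; the function names and the example distributions are the author's own choices for illustration.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p || q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def tsallis_entropy(p, q_param):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).

    Recovers Shannon entropy in the limit q -> 1.
    """
    if abs(q_param - 1.0) < 1e-12:
        return shannon_entropy(p)
    return (1.0 - sum(pi ** q_param for pi in p)) / (q_param - 1.0)

# A three-outcome distribution and the uniform reference measure.
p = [0.5, 0.25, 0.25]
u = [1 / 3, 1 / 3, 1 / 3]

print(shannon_entropy(p))       # 1.5 * ln 2 ~ 1.0397
print(kl_divergence(p, u))      # ln 3 - H(p) ~ 0.0589
print(tsallis_entropy(p, 2.0))  # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
```

Note that D(p || u) against the uniform measure equals log(n) minus the Shannon entropy, a standard identity that the generalized quantities in the paper extend to risk-sensitive settings.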
Keywords: Generalized Entropy, Generalized Kullback-Leibler Relative Entropy, Decision Theory, Expected Utility, Horse Race, Tsallis Entropy, Statistical Learning, Probability Estimation, Risk Neutral Pricing Measure