Logical Information Theory: New Logical Foundations for Information Theory
Ellerman, David. 2017. “Logical Information Theory: New Foundations for Information Theory.” Logic Journal of the IGPL 25 (5): 806–35.
31 pages. Posted: 29 Nov 2017
Date Written: June 7, 2017
There is a new theory of information based on logic. Shannon entropy, along with the notions of joint, conditional, and mutual entropy that Shannon defined, can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions, without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the (product) probability measure on the sets of distinctions. The compound notions of joint, conditional, and mutual entropy are obtained as the values of the measure on, respectively, the union, difference, and intersection of the sets of distinctions. These compound notions of logical entropy satisfy the usual Venn diagram relationships (e.g., the inclusion-exclusion formulas) exactly, since they are values of a measure in the sense of measure theory. The uniform transformation into the formulas for Shannon entropy is linear, which explains the long-noted fact that the Shannon formulas satisfy the Venn diagram relations only as an analogy or mnemonic, since Shannon entropy is not a measure (in the sense of measure theory) on a given set.
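The measure-theoretic behavior described in the abstract can be illustrated numerically. The sketch below uses the standard logical-entropy formula h(p) = 1 − Σ pᵢ², the probability that two independent draws yield distinct outcomes, and checks that the compound notions satisfy the Venn diagram (inclusion-exclusion) relations exactly. The joint distribution is a made-up example, not taken from the paper.

```python
def logical_entropy(probs):
    # h(p) = 1 - sum(p_i^2): probability that two independent
    # draws from the distribution land on distinct outcomes.
    return 1.0 - sum(p * p for p in probs)

# Hypothetical joint distribution over X in {'a','b'} and Y in {0,1}.
joint = {('a', 0): 0.3, ('a', 1): 0.2, ('b', 0): 0.1, ('b', 1): 0.4}

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), pr in joint.items():
    px[x] = px.get(x, 0.0) + pr
    py[y] = py.get(y, 0.0) + pr

h_x = logical_entropy(px.values())       # h(X)
h_y = logical_entropy(py.values())       # h(Y)
h_xy = logical_entropy(joint.values())   # joint h(X,Y): measure on the union

# Conditional entropy: measure on the difference of the distinction sets.
h_x_given_y = h_xy - h_y
# Mutual information: measure on the intersection, by inclusion-exclusion.
m_xy = h_x + h_y - h_xy

# The Venn relations hold exactly, because these are values of a measure.
assert abs(h_xy - (h_x_given_y + h_y)) < 1e-12
assert abs(h_xy - (h_x + h_y - m_xy)) < 1e-12
```

Because logical entropy is literally a measure on a set (the set of distinctions), these identities hold to machine precision rather than as an analogy, which is the contrast with Shannon entropy that the abstract draws.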
Keywords: logical entropy, Shannon entropy
JEL Classification: D8