Takeuchi's Information Criteria as a Form of Regularization

26 Pages. Posted: 16 Mar 2018. Last revised: 18 Apr 2018.

Matthew Francis Dixon

Illinois Institute of Technology

Tyler Ward

New York University (NYU)

Date Written: April 5, 2018


Takeuchi's Information Criterion (TIC) is a linearization of the maximum likelihood estimator's bias which shrinks the model parameters towards the maximum entropy distribution, even when the model is mis-specified. In statistical machine learning, $L_2$ regularization (a.k.a. ridge regression) also introduces a parameterized bias term with the goal of minimizing out-of-sample entropy, but it generally requires a numerical solver to find the regularization parameter. This paper presents a novel regularization approach based on TIC; the approach does not assume a data generation process and results in a higher entropy distribution through more efficient sample noise suppression. The resulting objective function can be directly minimized to estimate and select the best model, without the need to select a regularization parameter as in ridge regression. Numerical results on a synthetic high-dimensional dataset generated from a logistic regression model demonstrate superior model performance when using the TIC-based regularization over an $L_1$ or an $L_2$ penalty term.
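To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of using the empirical TIC penalty $\operatorname{tr}(\hat{J}^{-1}\hat{K})$ as a regularizer for logistic regression, where $\hat{J}$ is the empirical Hessian of the mean negative log-likelihood and $\hat{K}$ is the empirical outer product of per-sample scores. The synthetic data, jitter term, and optimizer choice are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Synthetic logistic-regression data (illustrative only)
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (rng.random(n) < expit(X @ w_true)).astype(float)

def tic_objective(w):
    p = expit(X @ w)
    # Negative log-likelihood of the logistic model
    nll = -np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    # Empirical J: Hessian of the mean NLL; empirical K: mean outer
    # product of per-sample score vectors (p_i - y_i) x_i
    J = (X.T * (p * (1 - p))) @ X / n
    K = (X.T * (p - y) ** 2) @ X / n
    # TIC penalty tr(J^{-1} K); small jitter keeps J invertible
    # (an illustrative numerical choice, not prescribed by the paper)
    penalty = np.trace(np.linalg.solve(J + 1e-8 * np.eye(d), K))
    return nll + penalty

# Direct minimization -- no regularization parameter to tune
res = minimize(tic_objective, np.zeros(d), method="BFGS")
```

Unlike ridge regression, there is no hyperparameter to cross-validate here: the penalty strength is determined by the data through $\hat{J}$ and $\hat{K}$.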

Keywords: regularization, information criterion, statistical inference

Suggested Citation

Dixon, Matthew Francis and Ward, Tyler, Takeuchi's Information Criteria as a Form of Regularization (April 5, 2018). Available at SSRN: https://ssrn.com/abstract=3139945 or http://dx.doi.org/10.2139/ssrn.3139945

Matthew Francis Dixon (Contact Author)

Illinois Institute of Technology ( email )

Department of Math
W 32nd St., E1 Room 208, 10 S Wabash Ave
Chicago, IL 60616
United States

Tyler Ward

New York University (NYU) ( email )

Bobst Library, E-resource Acquisitions
20 Cooper Square 3rd Floor
New York, NY 10003-711
United States


