Random Projection Estimation of Discrete-Choice Models with Large Choice Sets
34 Pages; Posted: 17 Mar 2016; Last revised: 11 Aug 2017
Date Written: May 26, 2016
We introduce sparse random projection, an important tool from machine learning, for the estimation of discrete-choice models with high-dimensional choice sets. First, the high-dimensional data are compressed into a lower-dimensional Euclidean space using random projections. In the second step, estimation proceeds using the cyclic monotonicity inequalities implied by the multinomial choice model; the estimation procedure is semiparametric and does not require explicit distributional assumptions on the random utility errors. The random projection procedure is justified via the Johnson-Lindenstrauss Lemma: the pairwise distances between data points are preserved during data compression, which we exploit to show convergence of our estimator. The estimator performs well in computational simulations and in an application to a real-world supermarket scanner dataset.
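The abstract does not specify the projection matrix, so as an illustrative sketch (not the authors' implementation), the compression step can be demonstrated with an Achlioptas-style sparse projection, whose entries are +1 or -1 with probability 1/6 each and 0 with probability 2/3; the scaling is chosen so that pairwise distances are preserved in expectation, as the Johnson-Lindenstrauss Lemma guarantees:

```python
import numpy as np

def sparse_random_projection(X, k, seed=0):
    """Compress the rows of X from d to k dimensions with a sparse
    random projection (Achlioptas-style: entries +-sqrt(3) with
    probability 1/6 each, 0 with probability 2/3), scaled by
    1/sqrt(k) so Euclidean distances are preserved in expectation."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    R = rng.choice([np.sqrt(3.0), 0.0, -np.sqrt(3.0)],
                   size=(d, k), p=[1 / 6, 2 / 3, 1 / 6])
    return X @ R / np.sqrt(k)

# Hypothetical high-dimensional data: 50 points in 5000 dimensions.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5000))
Z = sparse_random_projection(X, k=1000)

# Compare a pairwise distance before and after compression.
d_orig = np.linalg.norm(X[0] - X[1])
d_proj = np.linalg.norm(Z[0] - Z[1])
print(f"original: {d_orig:.2f}  projected: {d_proj:.2f}  "
      f"relative distortion: {abs(d_proj - d_orig) / d_orig:.3f}")
```

The sparsity means roughly two-thirds of the projection matrix entries are zero, so the compression is cheap to compute while still satisfying the distance-preservation property the paper relies on.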
Keywords: semiparametric multinomial choice models, random projection, large choice sets, cyclic monotonicity, Johnson-Lindenstrauss Lemma