Comparing Correlation Matrix Estimators Via Kullback-Leibler Divergence
20 Pages · Posted: 2 Dec 2011
Date Written: November 30, 2011
Abstract
We use a self-averaging measure, the Kullback-Leibler divergence, to evaluate the performance of four correlation estimators: the Fourier, Pearson, Maximum Likelihood, and Hayashi-Yoshida estimators. The study uses simulated transaction prices for a large number of stocks under different data-generating mechanisms, including synchronous and non-synchronous transactions with homogeneous and heterogeneous inter-transaction times. Different distributions of stock returns, namely the multivariate Normal and the multivariate Student's t-distribution, are also considered. We show that the Fourier and Pearson estimators are equivalent proxies of the `true' correlation matrix in all the settings under analysis, and that both are outperformed by the Maximum Likelihood estimator when prices are synchronously sampled and price fluctuations follow a multivariate Student's t-distribution. Finally, we propose solving the singularity problem affecting the Hayashi-Yoshida estimator by shrinking its correlation matrix towards either the Pearson or the Fourier matrix, and provide evidence that the resulting combination improves on each of its individual components.
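The two ingredients of the abstract can be sketched in a few lines: the Kullback-Leibler divergence between two zero-mean multivariate Gaussians parameterized by their correlation matrices, and the linear shrinkage used to repair a singular Hayashi-Yoshida matrix. This is a minimal illustration, not the paper's implementation; the Gaussian closed form and the shrinkage weight `alpha` are standard choices assumed here, and the function names are hypothetical.

```python
import numpy as np

def kl_divergence(sigma_true, sigma_est):
    """KL divergence D(N(0, sigma_true) || N(0, sigma_est)) between two
    zero-mean multivariate Gaussians, in the standard closed form
    0.5 * (tr(S1^-1 S0) - d + log det S1 - log det S0)."""
    d = sigma_true.shape[0]
    inv_est = np.linalg.inv(sigma_est)
    _, logdet_true = np.linalg.slogdet(sigma_true)
    _, logdet_est = np.linalg.slogdet(sigma_est)
    return 0.5 * (np.trace(inv_est @ sigma_true) - d + logdet_est - logdet_true)

def shrink(c_hy, c_target, alpha=0.5):
    """Linear shrinkage of a (possibly singular) Hayashi-Yoshida correlation
    matrix towards a full-rank target (e.g. a Pearson or Fourier estimate).
    Any alpha > 0 with a positive-definite target restores invertibility."""
    return alpha * c_target + (1.0 - alpha) * c_hy
```

With a positive-definite target, the shrunk matrix can be scored against the true correlation matrix via `kl_divergence`, which is zero only when the two matrices coincide.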
Keywords: Correlation estimation, Pearson estimator, Fourier estimator, Hayashi-Yoshida estimator, Kullback-Leibler divergence
JEL Classification: C13, G19