Ensembles of Overfit and Overconfident Forecasts

Yael Grushka-Cockayne (University of Virginia - Darden School of Business)
Victor Richmond R. Jose (Georgetown University - McDonough School of Business)
Kenneth C. Lichtendahl Jr. (University of Virginia - Darden School of Business)

August 18, 2015
Darden Business School Working Paper No. 2474438

Abstract: Firms today average forecasts collected from multiple experts and models. Because of cognitive biases, strategic incentives, or the structure of machine-learning algorithms, these forecasts are often overfit to sample data and overconfident. Little is known about the challenges of aggregating such forecasts. We introduce a theoretical model to examine the combined effect of overfitting and overconfidence on the average forecast. Under this combination, the mean and median probability forecasts are poorly calibrated: the hit rates of their prediction intervals are too high and too low, respectively. Consequently, we prescribe the use of a trimmed average, or trimmed opinion pool, to achieve better calibration. We identify the random forest, a leading machine-learning algorithm that pools hundreds of overfit and overconfident regression trees, as an ideal environment for trimming probabilities. Using several well-known datasets, we demonstrate that trimmed ensembles can significantly improve the random forest's predictive accuracy.
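The trimming idea in the random forest setting can be illustrated with a few lines of code. Below is a minimal sketch, assuming scikit-learn's RandomForestRegressor and a symmetric trimmed mean of the individual trees' point predictions; the synthetic dataset (make_friedman1), the 10% trim fraction, and the use of point rather than probability forecasts are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch: trim the extremes of a random forest's per-tree
# predictions before averaging, instead of taking the plain mean.
# Dataset, forest size, and trim fraction are illustrative choices.
import numpy as np
from scipy.stats import trim_mean
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

X, y = make_friedman1(n_samples=500, noise=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

forest = RandomForestRegressor(n_estimators=500, random_state=0)
forest.fit(X_train, y_train)

# Collect each (overfit, overconfident) tree's prediction: shape (n_trees, n_obs).
tree_preds = np.stack([t.predict(X_test) for t in forest.estimators_])

plain_mean = tree_preds.mean(axis=0)          # standard random forest output
trimmed = trim_mean(tree_preds, 0.1, axis=0)  # drop top/bottom 10% of trees per observation

print("RMSE, plain mean:   %.4f" % np.sqrt(np.mean((plain_mean - y_test) ** 2)))
print("RMSE, trimmed mean: %.4f" % np.sqrt(np.mean((trimmed - y_test) ** 2)))
```

Trimming both tails of the ensemble dampens the influence of trees whose predictions stray far from the crowd because they overfit their bootstrap samples, which is the intuition behind the trimmed opinion pool the abstract describes.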
Number of Pages in PDF File: 37
Keywords: wisdom of crowds; base-rate neglect; linear opinion pool; trimmed opinion pool; hit rate; calibration; random forest
JEL Classification: C10, C53, E17
Date posted: August 3, 2014; Last revised: August 19, 2015