Measuring and Adjusting for Overconfidence
Posted: 28 Jun 2012
Date Written: June 27, 2012
The scoring rule used to evaluate density forecasts is often chosen arbitrarily, yet this choice strongly influences the ranking of forecasts. This paper identifies overconfidence as the main driver of scoring differences and proposes a novel approach to measuring it. Based on a non-proper scoring rule, each forecast can be individually adjusted towards a calibrated forecast. Applying the adjustment procedure to the Survey of Professional Forecasters shows that out-of-sample forecasts can be significantly improved, and that the ranking of the adjusted forecasts becomes less sensitive to the choice of scoring rule.
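The claim that the choice of scoring rule can flip a forecast ranking is easy to illustrate with a hypothetical toy example (the forecasts and outcomes below are invented for illustration and do not come from the paper): a sharp, overconfident forecast and a flatter, better-calibrated one can be ranked oppositely by the logarithmic and the quadratic scoring rule, because the log score penalizes near-zero probabilities on realized outcomes unboundedly while the quadratic score is bounded.

```python
import math

# Hypothetical data: two density forecasts over three categories,
# scored on five realized outcomes (four hits, one miss for forecast A).
outcomes = [0, 0, 0, 0, 1]
forecast_a = [0.99, 0.005, 0.005]   # sharp, "overconfident" forecast
forecast_b = [0.55, 0.225, 0.225]   # flatter, better-calibrated forecast

def log_score(p, ys):
    # Logarithmic score (higher is better): sum of log p(y) over outcomes.
    return sum(math.log(p[y]) for y in ys)

def quadratic_score(p, ys):
    # Quadratic (Brier-type) score (higher is better): 2 p(y) - sum_i p_i^2.
    return sum(2 * p[y] - sum(q * q for q in p) for y in ys)

# The two proper scoring rules rank the same forecasts differently:
print(log_score(forecast_a, outcomes) > log_score(forecast_b, outcomes))
# -> False: the log score prefers B (A's single miss at p=0.005 is ruinous)
print(quadratic_score(forecast_a, outcomes) > quadratic_score(forecast_b, outcomes))
# -> True: the bounded quadratic score prefers the sharper A
```

Both rules are proper, so neither ranking is "wrong"; the example only shows why the arbitrary choice between them matters when comparing overconfident and calibrated forecasters.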
Keywords: belief measurement, proper scoring rules, overconfidence, probability adjustment
JEL Classification: C53, E37, D81