Scaling and Measurement Error Sensitivity of Scoring Rules for Distribution Forecasts
46 pages. Posted: 8 Nov 2019. Last revised: 30 Mar 2022.
Date Written: March 29, 2022
I examine the sensitivity of scoring rules for distribution forecasts along two dimensions: sensitivity to linear rescaling of the data and the influence of measurement error on the forecast evaluation outcome. First, I show that all commonly used scoring rules for distribution forecasts are robust to rescaling the data. Second, I show that the forecast ranking based on the continuous ranked probability score (CRPS) is less sensitive to gross measurement error than the ranking based on the log score. The theoretical results are complemented by a simulation study calibrated to frequently revised quarterly US GDP growth data and an empirical application forecasting realized variances of S&P 100 constituents.
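The contrast between the two scoring rules can be illustrated with Gaussian predictive distributions, for which both scores have closed forms. The sketch below (not from the paper; the specific forecast parameters and contaminated observation are illustrative assumptions) shows that the log score penalty grows quadratically in the size of an outlying observation, while the CRPS penalty grows only linearly, which is why a single grossly mismeasured observation can flip a log-score ranking more easily:

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def crps_normal(y, mu, sigma):
    """Closed-form CRPS of a N(mu, sigma^2) forecast at observation y
    (Gneiting-Raftery formula); lower is better."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm_cdf(z) - 1)
                    + 2 * norm_pdf(z) - 1 / math.sqrt(math.pi))

def log_score_normal(y, mu, sigma):
    """Negative log predictive density of a N(mu, sigma^2) forecast;
    lower is better."""
    z = (y - mu) / sigma
    return 0.5 * math.log(2 * math.pi) + math.log(sigma) + 0.5 * z * z

# Illustrative N(0, 1) forecast: a gross error moving the observation
# from y = 5 to y = 10 adds ~37.5 to the log score but only ~5 to the
# CRPS, so the CRPS-based ranking is harder to overturn.
for y in (1.0, 5.0, 10.0):
    print(f"y = {y:5.1f}  CRPS = {crps_normal(y, 0, 1):7.3f}  "
          f"log score = {log_score_normal(y, 0, 1):7.3f}")
```

The quadratic-versus-linear tail behavior, not the level of the scores, drives the differing sensitivity: a single contaminated observation contributes O(y^2) to the average log score but only O(|y|) to the average CRPS.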
Keywords: Forecast evaluation, measurement error, distribution forecasts, proper scoring rules
JEL Classification: C50, C52, C53