Scaling and Measurement Error Sensitivity of Scoring Rules for Distribution Forecasts
50 Pages. Posted: 8 Nov 2019. Last revised: 3 Mar 2023
Date Written: March 29, 2022
Abstract
I examine the sensitivity of scoring rules for distribution forecasts along two dimensions: sensitivity to linear rescaling of the data and the influence of measurement error on the forecast evaluation outcome. First, I show that all commonly used scoring rules for distribution forecasts are robust to rescaling of the data. Second, I show that the forecast ranking based on the continuous ranked probability score is less sensitive to gross measurement error than the ranking based on the log score. The theoretical results are complemented by a simulation study aligned with frequently revised quarterly US GDP growth data, a simulation study aligned with financial market volatility, and an empirical application forecasting realized variances of S&P 100 constituents.
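To make the comparison concrete, the sketch below (not taken from the paper; all forecast parameters, the outcome, and the size of the measurement error are hypothetical) evaluates two Gaussian density forecasts under both the log score and the CRPS, once on an error-free observation and once on an observation contaminated by a gross measurement error. The closed-form CRPS for a normal predictive distribution follows Gneiting and Raftery (2007).

```python
# Illustrative sketch: how the log score and the CRPS react when an
# observation is contaminated by a gross measurement error.
# Forecast parameters, the true outcome, and the error size are hypothetical.
import numpy as np
from scipy.stats import norm

def log_score(mu, sigma, y):
    """Negatively oriented log score: minus the log predictive density at y."""
    return -norm.logpdf(y, loc=mu, scale=sigma)

def crps_normal(mu, sigma, y):
    """Closed-form CRPS for a Gaussian predictive distribution."""
    z = (y - mu) / sigma
    return sigma * (z * (2.0 * norm.cdf(z) - 1.0)
                    + 2.0 * norm.pdf(z) - 1.0 / np.sqrt(np.pi))

# Two competing density forecasts for the same outcome (hypothetical values).
forecasts = {"sharp": (0.0, 1.0), "wide": (0.0, 2.0)}

y_true = 0.5        # latent error-free outcome
y_observed = 6.0    # observation contaminated by a gross measurement error

for y, label in [(y_true, "error-free"), (y_observed, "with gross error")]:
    print(f"--- observation {label} (y = {y}) ---")
    for name, (mu, sigma) in forecasts.items():
        print(f"{name:>5}: log score = {log_score(mu, sigma, y):7.3f}, "
              f"CRPS = {crps_normal(mu, sigma, y):7.3f}")
```

Both rules are negatively oriented here (smaller is better). In this toy setting both scores prefer the sharp forecast on the error-free observation, while under the contaminated observation the log score penalizes the sharp forecast far more heavily than the CRPS does, which is the kind of differential sensitivity to gross measurement error described in the abstract.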
Keywords: Forecast evaluation, measurement error, distribution forecasts, proper scoring rules
JEL Classification: C50, C52, C53