Empirical Risk Minimization for Time Series: Nonparametric Performance Bounds for Prediction
53 Pages. Posted: 13 Aug 2021. Last revised: 13 Apr 2023.
Date Written: August 6, 2021
Empirical risk minimization is a standard principle for choosing algorithms in learning theory. In this paper we study the properties of empirical risk minimization for time series. The analysis is carried out in a general framework that covers different types of forecasting applications encountered in the literature. We are concerned with one-step-ahead prediction of a univariate time series belonging to a class of location-scale parameter-driven processes. A class of recursive algorithms is available to forecast the time series. The algorithms are recursive in the sense that the forecast produced in a given period is a function of the lagged values of the forecast and of the time series. The relationship between the generating mechanism of the time series and the class of algorithms is not specified. Our main result establishes that the algorithm chosen by empirical risk minimization asymptotically achieves the optimal predictive performance attainable within the class of algorithms.
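As a minimal illustration of the selection principle the abstract describes, the sketch below applies empirical risk minimization to a small class of recursive one-step-ahead forecasters. The exponential-smoothing family, the squared-error loss, the AR(1) simulation, and all function names are illustrative assumptions, not the paper's actual process class or algorithm class: each forecast is a function of the lagged forecast and the lagged observation, and the selected algorithm is the one minimizing the average in-sample loss.

```python
import numpy as np

def recursive_forecasts(y, theta):
    """One-step-ahead forecasts from a recursive rule (illustrative):
    f_t = theta * y_{t-1} + (1 - theta) * f_{t-1}."""
    f = np.empty(len(y), dtype=float)
    f[0] = y[0]  # initialize the recursion with the first observation
    for t in range(1, len(y)):
        f[t] = theta * y[t - 1] + (1 - theta) * f[t - 1]
    return f

def empirical_risk(y, theta):
    """Average squared one-step-ahead prediction error (the empirical risk)."""
    f = recursive_forecasts(y, theta)
    return np.mean((y[1:] - f[1:]) ** 2)  # skip t=0 (initialization)

def erm_select(y, thetas):
    """Empirical risk minimization: pick the algorithm in the class
    (indexed by theta) with the smallest in-sample risk."""
    risks = [empirical_risk(y, th) for th in thetas]
    return thetas[int(np.argmin(risks))]

# Simulate an AR(1) series as a stand-in generating mechanism; note the
# class of smoothing forecasters above need not contain the true model.
rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.7 * y[t - 1] + rng.standard_normal()

grid = np.linspace(0.05, 0.95, 19)
theta_hat = erm_select(y, grid)
```

The point of the paper's main result, in this toy setting, is that the risk of the forecaster indexed by `theta_hat` approaches the best risk achievable over the whole grid as the sample grows, without assuming the smoothing class contains the true data-generating process.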
Keywords: Empirical risk minimization, oracle inequality, time series, forecasting, Markov chain
JEL Classification: C14, C22, C53, C58