Subsampling Inference for the Autocovariances and Autocorrelations of Long‐Memory Heavy‐Tailed Linear Time Series
19 Pages. Posted: 17 Oct 2012
Date Written: November 2012
We provide a self‐normalization for the sample autocovariances and autocorrelations of a linear, long‐memory time series with innovations that either have a finite fourth moment or are heavy‐tailed with tail index 2 < α < 4. In the asymptotic distribution of the sample autocovariance there are three rates of convergence, depending on the interplay between the memory parameter d and the tail index α, and these lead to three different limit distributions; for the sample autocorrelation, the limit distribution depends only on d. We introduce a self‐normalized sample autocovariance statistic that is computable without knowledge of α or d (or their relationship) and converges to a non‐degenerate distribution. We also treat self‐normalization of the autocorrelations. Because the corresponding asymptotic distributions remain parameter‐dependent, the sampling distributions are then approximated non‐parametrically by subsampling. The subsampling‐based confidence intervals for the process autocovariances and autocorrelations are shown to have satisfactory empirical coverage rates in a simulation study, and the impact of the subsampling block size on coverage is assessed. The methodology is further applied to the log‐squared returns of Merck stock.
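The mechanics of a subsampling confidence interval for a lag-h autocovariance can be sketched as follows. This is an illustrative simplification, not the paper's method: it assumes a known √n convergence rate (valid only in the short-memory, finite-fourth-moment case), whereas the paper's self-normalized statistic is designed precisely to avoid knowing the rate when d and α are unknown. The function names and the AR(1) toy series are purely illustrative.

```python
import numpy as np

def sample_autocov(x, h):
    # Lag-h sample autocovariance with divisor n (the usual biased estimator)
    n = len(x)
    xbar = x.mean()
    return np.dot(x[:n - h] - xbar, x[h:] - xbar) / n

def subsample_ci_autocov(x, h, b, alpha=0.05):
    """Subsampling CI for gamma(h) from overlapping blocks of length b.

    Sketch only: scales by sqrt(b) and sqrt(n), which presumes a sqrt(n)
    rate. Under long memory or heavy tails the rate differs, and the
    paper's self-normalization removes the need to know it.
    """
    n = len(x)
    g_full = sample_autocov(x, h)
    # Statistic on every overlapping block, centered at the full-sample value
    g_blocks = np.array([sample_autocov(x[i:i + b], h)
                         for i in range(n - b + 1)])
    # sqrt(b) * (gamma_hat_b - gamma_hat_n) approximates the law of
    # sqrt(n) * (gamma_hat_n - gamma) under the assumed rate
    root = np.sqrt(b) * (g_blocks - g_full)
    q_lo, q_hi = np.quantile(root, [alpha / 2, 1 - alpha / 2])
    return g_full - q_hi / np.sqrt(n), g_full - q_lo / np.sqrt(n)

# Toy short-memory AR(1) series, just to exercise the code
rng = np.random.default_rng(0)
n = 2000
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + eps[t]

lo, hi = subsample_ci_autocov(x, h=1, b=100)
```

The block size b plays the same tuning role examined in the paper's simulation study: larger b reduces the bias of the subsample distribution but leaves fewer (and more dependent) blocks, widening the interval's sampling variability.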
Keywords: Linear time series, parameter‐dependent convergence rates, self‐normalization, subsampling confidence intervals