Least Tail-Trimmed Squares for Infinite Variance Autoregressions
26 Pages Posted: 15 May 2012
Date Written: May 14, 2012
Abstract
We develop a robust least squares estimator for autoregressions with possibly heavy-tailed errors. Robustness to heavy tails is ensured by negligibly trimming the squared error according to extreme values of the error and regressor. Tail-trimming ensures asymptotic normality and super-√n convergence, with a rate comparable to the highest achieved among M-estimators for stationary data. Moreover, tail-trimming ensures robustness to heavy tails in both small and large samples. By comparison, existing robust estimators are less robust in small samples, converge more slowly when the variance is infinite, or are not asymptotically normal. We present a consistent estimator of the covariance matrix and treat classic inference without knowledge of the rate of convergence. A simulation study demonstrates the sharpness and approximate normality of the estimator, and we apply the estimator to financial returns data.
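The trimming idea described in the abstract can be illustrated with a minimal sketch: fit an autoregression by least squares after dropping observations whose regressor or residual magnitude lies in the extreme sample tail. This is only an illustration under simplifying assumptions (an AR(1) model, a fixed trimming fraction, symmetric trimming on both error and regressor); the paper's estimator uses a negligible trimming fraction that shrinks with the sample size, and its exact construction differs.

```python
import numpy as np

def tail_trimmed_ls(y, trim_frac=0.05, n_iter=5):
    """Illustrative tail-trimmed least squares for AR(1): y_t = rho*y_{t-1} + e_t.

    Observations whose regressor |y_{t-1}| or residual |e_t| falls in the
    upper trim_frac tail are dropped before re-estimating rho.  The paper
    lets the trimmed fraction vanish asymptotically; a fixed fraction is
    used here purely for illustration.
    """
    x, z = y[:-1], y[1:]                     # regressor and response
    rho = (x @ z) / (x @ x)                  # plain OLS starting value
    for _ in range(n_iter):
        resid = z - rho * x
        x_cut = np.quantile(np.abs(x), 1 - trim_frac)
        e_cut = np.quantile(np.abs(resid), 1 - trim_frac)
        keep = (np.abs(x) <= x_cut) & (np.abs(resid) <= e_cut)
        rho = (x[keep] @ z[keep]) / (x[keep] @ x[keep])
    return rho

# Simulate an AR(1) with infinite-variance (Student-t, df < 2) errors.
rng = np.random.default_rng(0)
n = 5000
e = rng.standard_t(df=1.5, size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + e[t]

rho_hat = tail_trimmed_ls(y)
```

Despite errors with infinite variance, the trimmed estimate stays close to the true coefficient 0.5, because the observations driven by extreme error or regressor values are excluded from the least squares fit.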