Forecasting Methods and Principles: Evidence-Based Checklists

Journal of Global Scholars of Marketing Science, Forthcoming

29 Pages Posted: 9 Aug 2018

J. Scott Armstrong

University of Pennsylvania - Marketing Department

Kesten C. Green

University of South Australia - UniSA Business; Ehrenberg-Bass Institute

There are 2 versions of this paper

Date Written: January 6, 2018


Problem: How to help practitioners, academics, and decision makers use experimental research findings to substantially reduce forecast errors for all types of forecasting problems.
Methods: Findings from our review of forecasting experiments were used to identify methods and principles that lead to accurate forecasts. Cited authors were contacted to verify that our summaries of their research were correct. We developed checklists to help forecasters and their clients conduct and commission studies that adhere to forecasting principles and use valid methods. Leading researchers were asked to identify errors of omission or commission in our analyses and summaries of research findings.
Findings: Forecast accuracy can be improved by using one of 15 relatively simple evidence-based forecasting methods. One of those methods, knowledge models, provides substantial improvements in accuracy when causal knowledge is good. On the other hand, data models—developed using multiple regression, data mining, neural nets, and “big data analytics”—are unsuited for forecasting.
Originality: Three new checklists for choosing validated methods, developing knowledge models, and assessing uncertainty are presented. A fourth checklist, based on the Golden Rule of Forecasting, was improved.
Usefulness: Combining forecasts within individual methods and across different methods can reduce forecast errors by as much as 50%. Forecast errors from currently used methods can be reduced by increasing their compliance with the principles of conservatism (Golden Rule of Forecasting) and simplicity (Occam’s Razor). Clients and other interested parties can use the checklists to determine whether forecasts were derived using evidence-based procedures and can, therefore, be trusted for making decisions. Scientists can use the checklists to devise tests of the predictive validity of their findings.
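The combining described above can be as simple as taking the unweighted average of forecasts produced by different methods. The following is a minimal sketch of equal-weight combining; the method names and numbers are illustrative assumptions, not data from the paper.

```python
# Hypothetical forecasts of the same quantity from three different
# evidence-based methods (illustrative values only).
forecasts = {
    "extrapolation": 104.0,
    "expert_judgment": 112.0,
    "knowledge_model": 97.0,
}

# Equal-weight combining: the simple average. Errors of the individual
# methods tend to offset one another, which is why combining reduces
# forecast error on average.
combined = sum(forecasts.values()) / len(forecasts)
print(round(combined, 2))  # 104.33
```

Equal weights are a deliberately conservative choice: estimating "optimal" weights from small samples typically adds error rather than removing it.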

Keywords: combining forecasts, data models, decomposition, equalizing, expectations, extrapolation, knowledge models, intentions, Occam’s razor, prediction intervals, predictive validity, regression analysis, uncertainty

JEL Classification: C53, C54, C55, C63, C7, H68, J11, Q11, Q21, Q31, Q47, Q54, R2, R3, R4

Suggested Citation

Armstrong, J. Scott and Green, Kesten C., Forecasting Methods and Principles: Evidence-Based Checklists (January 6, 2018). Journal of Global Scholars of Marketing Science, Forthcoming, Available at SSRN:

J. Scott Armstrong

University of Pennsylvania - Marketing Department ( email )

700 Jon M. Huntsman Hall
3730 Walnut Street
Philadelphia, PA 19104-6340
United States
215-898-5087 (Phone)
215-898-2534 (Fax)


Kesten C. Green (Contact Author)

University of South Australia - UniSA Business ( email )

GPO Box 2471
Adelaide, SA 5001
+61 8 83012 9097 (Phone)


Ehrenberg-Bass Institute ( email )


