Researchers’ Data Analysis Choices: An Excess of False Positives?
29 Pages. Posted: 21 Dec 2017. Last revised: 30 Jul 2018
Date Written: July 6, 2018
The paper assesses how standard data-analysis conventions affect the quality of current research, focusing on a likely failure: a pervasiveness of false null-rejections. The argument goes beyond criticizing the (typically accepted) practice of researchers publishing only those results that support their desired findings. This selection bias also interacts perversely with a reliance on powerful statistical methods, which pose problems of their own. The potential for an abundance of false positives (“type I errors”) becomes apparent. The paper notes that supplementary data analyses can readily counter this problem through statistical methods that make null-rejections less likely. Yet publication conventions do not require reports on such tests, and this absence impairs attempts to assess findings’ intrinsic robustness. Rectifying these shortcomings requires a willingness to allow for equivocal findings: in an inherently complex world, comprehensive data analysis should often oblige researchers to acknowledge that “the evidence supporting our core hypothesis must be qualified because….” The paper discusses the professional incentives that inspire researchers to claim the opposite; that is, researchers tend to present findings without a trace of qualification, a recipe for a high incidence of false positives. The paper accordingly argues that, over the long run, much of the current literature will be dismissed as at best dubious.
Keywords: research methodology