31 Pages Posted: 15 Nov 2005
Date Written: August 2005
Both criminal and regulatory law have traditionally been skeptical of what Jeremy Bentham referred to as evidentiary offenses - the prohibition (or regulation) of some activity not because it is wrong, but because it probabilistically (though not universally) indicates that a real wrong has occurred. From Bentham to the present, courts and theorists have worried about this form of regulation, believing that certainly in the criminal law context, but even with respect to regulation, it is wrong to impose sanctions on a "where there's smoke there's fire" theory of governmental intervention. Yet although this kind of punishment by proxy continues to be held in disrepute, both in courts and in the literature, we argue that this distaste is unwarranted. Regulating - even through the criminal law - by targeting intrinsically innocent activities that probabilistically but not inexorably indicate not-so-innocent activities is no different from the vast number of other probabilistic elements that pervade the regulatory process. Once we recognize the frequency with which we accept probabilistic but not certain burdens of proof, probabilistic but not certain substantive rules, and probabilistic but not certain pieces of evidence, we can see that defining offenses and regulatory targets in terms of non-wrongful behavior that is evidence of wrongful behavior is neither surprising nor inadvisable.
Keywords: Business and Government Policy, Crime and Criminal Justice, Economics - Microeconomics, Environment and Natural Resources, Law and Legal Institutions, Regulation
JEL Classification: H00
Suggested Citation:
Schauer, Frederick and Zeckhauser, Richard J., Regulation by Generalization (August 2005). AEI-Brookings Joint Center Working Paper No. 05-16; KSG Working Paper No. RWP05-048. Available at SSRN: https://ssrn.com/abstract=847849 or http://dx.doi.org/10.2139/ssrn.847849