Regulation by Generalization
University of Virginia School of Law
Richard J. Zeckhauser
Harvard University - Harvard Kennedy School (HKS); National Bureau of Economic Research (NBER)
AEI-Brookings Joint Center Working Paper No. 05-16
KSG Working Paper No. RWP05-048
Both criminal and regulatory law have traditionally been skeptical of what Jeremy Bentham called evidentiary offenses - the prohibition (or regulation) of an activity not because it is wrong in itself, but because it probabilistically (though not universally) indicates that a real wrong has occurred. From Bentham to the present, courts and theorists have worried about this form of regulation, believing that it is wrong - certainly in the criminal law, but even in ordinary regulation - to impose sanctions on a "where there's smoke there's fire" theory of governmental intervention. Yet although this kind of punishment by proxy continues to be held in disrepute, both in the courts and in the literature, we argue that the distaste is unwarranted. Regulating - even through the criminal law - by targeting intrinsically innocent activities that probabilistically but not inexorably indicate not-so-innocent activities is no different from the vast number of other probabilistic elements that pervade the regulatory process. Once we recognize how frequently we accept probabilistic but not certain burdens of proof, probabilistic but not certain substantive rules, and probabilistic but not certain pieces of evidence, we can see that defining offenses and regulatory targets in terms of non-wrongful behavior that is evidence of wrongful behavior is neither surprising nor inadvisable.
Number of Pages in PDF File: 31
Keywords: Business and Government Policy, Crime and Criminal Justice, Economics - Microeconomics, Environment and Natural Resources, Law and Legal Institutions, Regulation
JEL Classification: H00
Date posted: November 15, 2005