Cyber Security: Of Heterogeneity and Autarky
35 Pages Posted: 15 Sep 2004
Date Written: August 2004
The wonder of the Internet is incredibly capable computers connected with each other under the control of individuals. For all of the reasons that we think decentralization is a powerful force, we have applauded the ability of individual users to set up websites and make their ideas available to others. But there is a dark side as well. Always-on connections, extra computing cycles, and gigabytes of storage to burn mean that individual decisions can propagate throughout the network quickly. The small-worlds phenomenon that is the Internet means that my computer is only a handful of clicks away from a malicious computer programmer.
My decisions matter for your computing life. A malicious hacker can turn my computer into a zombie and use my broadband connection and my computer to shut down websites, to send millions of spam emails, or worse. The network is a sea of computing externalities, many extraordinarily positive but others that range from everyday bothersome to enormously disruptive. And the more we embed critical infrastructure into the public network, the easier we make it for a cyber-terrorist to turn our computing resources against us and thereby harm critical infrastructure, such as the electricity grid or our communications networks.
Addressing cyber security is a mixed question of engineering (computing architecture) and legal rules. The zombie PC problem emerges with the rise of the Internet and decentralized control over PCs. The pricing structure of the Internet world (one-price, all-you-can-eat broadband) and lumpy computing power in the form of powerful CPUs kill off many of the natural incentives for an individual to ensure that her computing resources are not being used by others. This can be good, as it creates many opportunities for sharing, but the downside is that there is little reason for the individual computer user to police against zombification.
In this article, I consider two issues in detail. The monoculture argument is one approach to architecting the network. That argument suggests that we should focus on forcing heterogeneity in operating systems to enhance our cyber security. I think that is the wrong emphasis. On its own terms, the argument tells us little about the extent of diversity that would be required to achieve meaningful protection, especially if our concern is the cyber-terrorist. The argument also ignores the more important question of adaptability, meaning how quickly the current system can adapt to new conditions. Instead, I argue in favor of the traditional approach of isolation (autarky) in separating critical infrastructure from the public network.
Second, I consider the way in which liability rules for software might influence the quality of software and software use decisions. Hackers can exploit defects in software to seize control of machines. With fewer defects to exploit, we might reduce the harms of hacking. This turns out to be tricky. Broad liability rules that would protect consumers from the harms of hacking will lead to the standard moral hazard problem that we see in insurance: consumers who should not be using computers, or should not be on the network, will jump on once they are protected from hacking losses.
These are standard products liability issues, but software has two particular features that suggest that we should not simply apply our standard approaches to products liability. First, we learn about software through use. One piece of software is combined with other software in a way that a Coke bottle is rarely combined with anything else. Second, software can adapt and can be fixed in place after the fact. Both of these features should push toward earlier release of software, with bugs to be fixed later.
Keywords: cyber-security, monoculture, network externalities, computer software, product liability rules