This week’s issue of The New Republic (June 24, 2010) features a cover story by Harvard Law School’s Jack Goldsmith on cyberwar. It’s a long, serious review essay, using Richard A. Clarke and Robert K. Knake’s new book, Cyber War, as its hook. But Jack goes well beyond a book review into the rapidly expanding literature on the subject – a literature expanding across technical computer science and engineering, software, security, strategic, and legal lines. The essay is terrifically well written and intelligent, and I strongly recommend it (full disclosure: I haven’t read the book under review), whether you already know the field or are looking for an overview of it. One thing is clear: this subject is not going away.
Years ago I decided that my inner geek’s comparative advantage lay in robotics, but I read this essay with particular attention to its discussion of the complexity of systems – just how hard it is to get a handle on cyber systems and their diffuse, distributed nature:
Many factors make computer systems vulnerable, but the most fundamental factor is their extraordinary complexity. Most computers connected to the Internet are general-purpose machines designed to perform multiple tasks. The operating-system software that manages these tasks – as well as the computer’s relationship to the user – typically has tens of millions, and sometimes more than one hundred million, lines of operating instructions, or code. It is practically impossible to identify and to analyze all the different ways these lines of code can interact or might fail to operate as expected. And when the operating-system software interfaces with computer processors, various software applications, Web browsers, and the endless and endlessly complex pieces of hardware and software that constitute the computer and telecommunications networks that make up the Internet, the potential for unforeseen mistakes or failures becomes unfathomably large.
The complexity of computer systems often leads to accidental mistakes or failures. We have all suffered computer crashes, and sometimes these crashes cause serious problems. Last year the Internet in Germany and Sweden went down for several hours due to errors in the domain name system that identifies computers on the Internet. In January of this year, a software problem in the Pentagon’s global positioning system network prevented the Air Force from locking onto satellite signals on which they depend for many tasks. The accident on the Washington Metro last summer, which killed nine people and injured dozens, was probably caused by a malfunction in the computer system that controls train movements. Three years ago, six stealth F-22 Raptor jets on their maiden flights were barely able to return to base when their onboard computers crashed.
The same complexity that leads to such malfunctions also creates vulnerabilities that human agents can use to make computer systems operate in unintended ways. Such cyber threats come in two forms. A cyber attack is an act that alters, degrades, or destroys adversary computer systems or the information in or transiting through those systems. Cyber attacks are disruptive activities. Examples include the manipulation of a computer system to take over an electricity grid, or to block military communications, or to scramble or erase banking data. Cyber exploitations, by contrast, involve no disruption, but merely monitoring and related espionage on computer systems, as well as the copying of data that is on those systems. Examples include the theft of credit card information, trade secrets, health records, or weapons software, and the interception of vital business, military, and intelligence communications.
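The complexity point is easy to make concrete with a little arithmetic. Here is a quick back-of-the-envelope sketch – mine, not Goldsmith’s or Clarke and Knake’s – that treats a system as n interacting components and simply counts the possible pairwise and three-way interactions among them (the component counts are illustrative assumptions, nothing more):

```python
# Back-of-the-envelope: how fast does the space of potential
# interactions grow with the number of components? The component
# counts below are illustrative assumptions, not figures from
# the essay or the book.
from math import comb

for n in (10, 100, 1_000, 10_000):
    pairs = comb(n, 2)    # possible two-component interactions
    triples = comb(n, 3)  # possible three-component interactions
    print(f"{n:>6} components: {pairs:>12,} pairs  {triples:>16,} triples")
```

At ten thousand components – a tiny number next to the tens of millions of lines of code Goldsmith cites – there are already roughly fifty million possible pairwise interactions and more than 160 billion three-way ones. No testing regime exhausts a space like that, which is why both the accidental failures and the exploitable vulnerabilities he describes are, in a sense, baked in.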
This drew my attention in part because of my interest in complexity, and in complex systems interacting with one another, in a different part of my work – finance and financial regulation. Duke’s Steve Schwarcz and I are writing a book on financial regulation reform, and our approach – in a field currently getting saturated with books on this very topic – is to offer pragmatic, basic heuristics, rules of thumb really, for how financial regulation needs to be designed. Not some super-deep conceptualization, but something much more practical.
The same pragmatic assessment applies to diagnosing What Went Wrong, so to speak, in financial regulation. We have settled on three homely, but still useful, categories: complexity, complacency, and conflicts (cupidity we take for granted). They’re useful precisely because they’re homely. Complexity hides conflicts that undermine basic duties of loyalty, breeds complacency that undermines basic duties of care, and both feed back into the development of still more complexity. The three stoke each other.
Professor Schwarcz has a Washington University Law Review paper on regulating complexity in finance and financial regulation, from which we are drawing for the book. I recommend it to those interested in financial regulation and complexity – but I also recommend it as a way of thinking comparatively about complexity in other settings that cross-weave technological and legal-regulatory divides.