Cyberwar and Industrial Controls: A conversation with Ralph Langner

I got a call the other day from Ralph Langner, the man who reverse-engineered Stuxnet. He wanted to compare views on cyberweapons and industrial control systems.

These systems have now been installed in most of the infrastructure that supports civilian life. But Stuxnet showed how vulnerable such systems can be to cyberattack. I fear that civilians will be targeted by cyberweapons in future conflicts – and that we have no good response to that threat.

Langner, a German national whose consulting firm remediates infrastructure risks, knows a lot about these issues. Here is an edited version of a conversation that left me both disturbed and, oddly, optimistic.

BAKER: After Stuxnet, am I right to worry that we could see a cyberattack on industrial control systems with the kind of massive civilian damage that will look and feel like war?

LANGNER: There’s no doubt that cyberattacks on industrial control systems are possible. And that they could cause large-scale disasters. Causing major physical damage by changing the code executed on an industrial control system just isn’t that hard, and it definitely doesn’t require the resources of a nation-state.

BAKER: Like what?

LANGNER: The big difference between IT and control systems is that manipulating the latter doesn’t simply change data, but physical reality – such as destroying equipment, maliciously changing products (think about food and beverages for example), or causing explosions. And those changes cannot be fixed by simply restoring from backup tapes.

It’s not just the loss of infrastructure we need to worry about. Chemical plants run on industrial control systems; they could be remotely instructed to release gases that would kill people in surrounding neighborhoods – a cyber Bhopal scenario. That’s a huge problem, because there are several thousand potential chemical targets in the US alone.

BAKER: Even if the industrial control system is fully separated from the Internet?

LANGNER: Sure. You can cause the damage using rogue code infiltrated via contractors’ BYOD [“bring your own device” – a recent IT trend allowing employees to use their own equipment on corporate networks – SB] to jump the air gap, as in the Stuxnet scenario. Such rogue code operates autonomously once it’s released into the system.

BAKER: A kind of “fire and forget” weapon, like a cruise missile?

LANGNER: Right. And once it reaches the target, it can cause disaster within minutes. Stuxnet again showed the way. Its authors were never in direct communication with the malware’s payload after it infected the Natanz enrichment facility. They didn’t need to be. The rogue code did all the damage on its own.

BAKER: So, are widespread civilian infrastructure disasters inevitable in future conflicts? Are we just screwed?

LANGNER: Only if we are stupid. There’s a lot we can do to prevent those disasters. The technical problem is solvable. The real problem is political will.

BAKER: That’s weird. It feels as though all we do these days is talk about cyberwar. You’re telling me that, despite all the attention the problem is getting, we’re actually not doing things that would protect civilians?

LANGNER: I think you said it: “All we do is talk about cyberwar.” All talk, no action – at least when it comes to system protection. When it comes to offensive cyber capabilities, it’s more like the opposite: little talk, much action, juicy funding. Look, we can do this. Industrial control systems are not like the IT networks we’ve been trying unsuccessfully to secure for the last twenty years. They’re a lot simpler. The heart of any system is a group of programmable logic controllers, or PLCs, that control an industrial process, saying “Open that valve” or “Shut down that component.” PLCs may run code, and they may be networked, but they aren’t yet bloated with features and complexity. They’re about as sophisticated as an Apple II.

BAKER: Hey, that was my first computer! It sure felt sophisticated at the time.

LANGNER: Still, a PLC is a lot easier to secure. PLCs don’t multitask, and the “attack surface” is a lot more limited than that of a current PC. What’s more, we’re not talking about some massive network. You can run an entire electric power plant with maybe 200 PLCs. Once the system is installed and running, the configuration is never changed. You don’t have users insisting that new applications or devices be hooked up every month. Understanding and securing a small, static network with limited computing power is not beyond the capacity of mere mortals.
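
[Langner’s point about small, static networks can be made concrete. Because legitimate traffic on a commissioned plant network never changes, even a trivial allowlist will flag an intruder. Below is a minimal sketch in Python; the addresses, ports, and flows are invented for illustration. – SB]

    # Hypothetical sketch: on a static control network, legitimate traffic
    # can be enumerated in advance. Anything off the list is suspect.
    # All addresses, ports, and flows below are invented examples.

    ALLOWED_FLOWS = {
        # (source IP, destination IP, destination port)
        ("10.10.1.5", "10.10.2.10", 502),    # HMI -> PLC, Modbus/TCP
        ("10.10.1.5", "10.10.2.11", 502),
        ("10.10.3.2", "10.10.2.10", 44818),  # historian -> PLC, EtherNet/IP
    }

    def check_flow(src: str, dst: str, port: int) -> bool:
        """Return True if a flow matches the commissioned baseline."""
        return (src, dst, port) in ALLOWED_FLOWS

    # A contractor laptop talking straight to a PLC gets flagged:
    if not check_flow("10.10.9.77", "10.10.2.10", 502):
        print("ALERT: unexpected flow on control network")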

BAKER: Yet you say we’re not doing it. Why not?

LANGNER: I think the managers of infrastructure companies are kidding themselves. They’re trying to apply “risk management” tools to the problem, and that means they aren’t actually spending much on solutions.

BAKER: Hey, wait a minute. I believe in risk management. If serving at DHS taught me anything, it’s that you can’t eliminate risk; you have to manage it. And you shouldn’t spend more to avoid a disaster than the disaster would cost. Isn’t risk management just a tool for implementing that common-sense notion? You multiply the probability of an attack by the consequences if the attack succeeds. That gives you a good idea of the most you should spend to prevent the attack.
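
[To make that arithmetic concrete, here is the expected-loss calculation with invented numbers – and, as Langner explains next, the probability input is exactly where the trouble starts. – SB]

    # Illustrative only: both inputs are guesses.
    annual_probability = 0.02      # chance of a successful attack per year
    consequence = 500_000_000      # damage in dollars if the attack succeeds

    expected_annual_loss = annual_probability * consequence
    print(f"Spend at most ${expected_annual_loss:,.0f} per year on prevention")
    # -> Spend at most $10,000,000 per year on prevention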

LANGNER: Sure, in theory. Except no one really knows how to measure the probability of a cyberattack. You can always find people who’ll say, “It hasn’t happened yet, so the probability is practically zero.” And that’s a very welcome message, because it dramatically reduces what the company has to spend on cybersecurity. Add to that the fact that IT departments have little understanding of control systems. Many CIOs wouldn’t even recognize a PLC if they saw one, let alone implement cybersecurity for its operation.

BAKER: And that saving goes right to the bottom line. At least until something bad happens.

LANGNER: Right. You almost can’t blame the companies. They’ve got a theory that makes sense. They’ve got a big mathematical model that no one really understands. They can show it to the regulators (in regulated industries). But it’s all built on guesswork mixed with self-interest.

BAKER: Garbage in, profits out.

LANGNER: Well …

BAKER: But that kind of risk analysis is nuts. Cyberweapons are not hurricanes or tsunamis. They’re the product of a thinking adversary. If you want to judge the likely actions of a thinking adversary, you should ask first what the adversary’s capabilities are and then whether he could gain an advantage by using those capabilities. We know the answers to those questions, and they aren’t pretty.

Capabilities? No one has been able to prevent network intrusions on IT systems, and Stuxnet shows that moving from intrusion to destruction is the easy part.

So the only question is whether an adversary will find it advantageous to attack our civilians with cyberweapons, and the answer is obvious: Considering how easily Western civilians succumb to war weariness even when they’re not suffering directly, surely some of our adversaries will be tempted to jump-start that weariness with a direct attack on our civilians. Especially if they think they can stay anonymous or blame the attack on someone else.

Using that analysis, I’d say that the likelihood of a civilian cyberattack in the next decade is closer to 100 percent than to zero.

LANGNER: But that’s not how the risk management models are being used. I talk to the companies that manufacture industrial control systems, and they say there’s no demand for more security. My company provides security consulting for asset owners, and implementing our recommendations may ultimately cost around $1–$2 million per site for complex plants. Even that is too much for a lot of companies, which seems bizarre when you consider that it may be just one percent of the acquisition cost of a large-scale control system.

BAKER: I always thought that the real sticking point for these systems was the need for operational reliability. If the plant goes down at midnight and the system expert is skiing in Vail, you can’t wait 12 hours for him to fly back. The system has to be easy to access remotely. And you can’t change the default passwords either, in case the system expert skis into a tree.

LANGNER: I think the operational reliability excuse is seriously overdone. There are plenty of ways to get both security and reliability. And it can certainly be argued that an insecure system cannot be reliable. By the way, secure remote access has been a solved technical problem for years, and the claim that a default password, perhaps even hard-coded into the product, would increase reliability seems like quite a stretch.

BAKER: But isn’t it dangerous to monkey with an industrial control system once it’s up and running? There are only two possible outcomes, after all. Either nothing changes or something breaks. Not exactly an incentive to upgrade security.

LANGNER: Well, sure, that’s the first rule of industrial control systems: never touch a running system. That said, it’s worth exploring the deeper reason for this rule. The sobering fact is that more often than not, those systems are not supposed to be “touched” because they are so fragile that maintenance engineers have lost faith in their ability to keep operating under even slightly modified parameters. In the majority of installations, system behavior is actually no longer understood (and certainly not documented properly). You may be willing to live with that in IT, but you’ll probably feel some discomfort when thinking about nuclear power plants. Anyway, there’s still plenty you can do to improve security without touching the design of the system.

BAKER: Such as?

LANGNER: The kinds of mitigation measures we recommend for running systems are policy and architectural: network segregation, restricting contractors’ remote access, security training and awareness, and the like. Some of the most serious problems are actually quite easy to address, such as the BYOD problem. It is “worst practice” at most critical infrastructure facilities to allow contractors to bring their ill-secured laptops and connect them to the most critical systems. A sophisticated cyber attacker wouldn’t try to infiltrate a power plant or water plant directly, but indirectly, via contractors who then spread custom malware without knowing it.
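
[A minimal sketch of the network segregation Langner describes, loosely in the spirit of the Purdue reference model: the corporate network may reach the control network only through a DMZ. The zone names, address ranges, and rule table are my assumptions for illustration. – SB]

    import ipaddress

    # Hypothetical zones; a real plant would define these per site.
    ZONES = {
        "corporate": "10.1.0.0/16",
        "dmz": "10.2.0.0/24",
        "control": "10.3.0.0/24",
    }

    # Only adjacent zones may talk; corporate never reaches control directly.
    PERMITTED = {("corporate", "dmz"), ("dmz", "control"), ("control", "dmz")}

    def zone_of(ip: str) -> str:
        """Map an address to its zone, or 'unknown' if it fits none."""
        addr = ipaddress.ip_address(ip)
        for name, net in ZONES.items():
            if addr in ipaddress.ip_network(net):
                return name
        return "unknown"

    def permitted(src_ip: str, dst_ip: str) -> bool:
        return (zone_of(src_ip), zone_of(dst_ip)) in PERMITTED

    # A contractor laptop on the corporate LAN cannot hit a PLC directly:
    assert not permitted("10.1.5.23", "10.3.0.10")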

BAKER: That still doesn’t sound like a complete solution. Surely you have to design the PLCs themselves not to run unapproved code or accept instructions from unauthorized sources. That means you’ll have to swap equipment on running systems, doesn’t it?

LANGNER: I don’t recommend widespread security upgrades on running systems. Look, there are easier ways to improve industrial control security. New industrial control systems are being installed every day. New power plants are being built, along with new pipelines and new refineries. Let’s start by securing them. Seriously, one of the most bizarre aspects of cyber insecurity in critical infrastructure is that we continue installing these insecure-by-design systems in new plants that will be here to stay for another twenty or thirty years. Our highest priority should be to make sure new systems meet high cybersecurity standards – just as they already have to meet higher environmental standards than existing plants.
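
[One way a newly built system could enforce the “run approved code only” idea raised above is an integrity gate: before any program is loaded onto a controller, its tag must match an engineering-approved build. The sketch below is hypothetical; it uses a shared-key tag for brevity, where a real product would anchor asymmetric signatures in hardware. – SB]

    import hashlib
    import hmac

    SITE_KEY = b"placeholder-secret"  # in practice, kept in an HSM, not in source

    def approve(program: bytes) -> str:
        """Engineering workstation: tag an approved build."""
        return hmac.new(SITE_KEY, program, hashlib.sha256).hexdigest()

    def controller_accepts(program: bytes, tag: str) -> bool:
        """Controller side: refuse any code without a valid tag."""
        expected = hmac.new(SITE_KEY, program, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    build = b"ladder-logic-binary"
    tag = approve(build)
    assert controller_accepts(build, tag)          # approved code loads
    assert not controller_accepts(b"rogue", tag)   # tampered code is refused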

BAKER: That’s clever. Where electricity is concerned, a lot of us could limp along in an emergency with maybe 20% of the power we’re used to. So if we start by securing new sources of power, it won’t be long before we have a much more resilient infrastructure. And that would make us a less attractive target.

The same is true for a lot of infrastructure. Losing all of it would challenge societal survival; losing part of it would merely be inconvenient.

In the same vein, let me ask you about my pet proposal. The Obama administration is forcing the closure of the old coal-burning power plants that were grandfathered under the Clean Air Act. I assume that after they’re decommissioned, the plants will be scrapped and the land redeveloped. Wouldn’t we be better off if the old plants were mothballed instead, like aging battleships that might be needed again? These are old plants built with analog equipment that is mostly immune to cyberattack. If we simply left them in place, ready to fire up in an emergency, it seems to me that we’d be well on the way to a 20% emergency reserve of generating capability.

Am I right to think that these old plants are worth saving?

LANGNER: Absolutely. Digital control systems weren’t common in the 1970s or earlier. Even today, there are hundreds of plants still running analog safety control systems. Those old coal-fired plants are almost certainly protected against most cyberattacks simply because they don’t use insecure digital technology. Unlike a kinetic attack, a cyberattack can only affect vulnerable digital systems. The fewer of those systems you have, the less vulnerable you are.

BAKER: Wow. I’m actually starting to feel better. It sounds as though we could do a lot to harden our infrastructure and make a nightmare war on our civilians less likely.

LANGNER: All true. Except that we’re not actually doing any of this. Unless the government intervenes, profit-making businesses are going to keep lowballing their expenditures on security.

BAKER: What should the government do? Do you really think regulators can keep track of new attacks and new defenses and then write security rules as fast as the bad guys write code?

LANGNER: Regulating security measures is hard, but not impossible. In some ways, the Nuclear Regulatory Commission does the best job. In a very brief regulatory document (10 CFR 73.54), it simply tells operators of nuclear power plants, “You must show that you’ve addressed the cybersecurity problem.” The smart thing is that the NRC doesn’t tell operators exactly what to do; it simply requires them to demonstrate verifiably that they did their homework on cyber. The specifics of the plant’s cybersecurity plan, which can be totally idiosyncratic to a plant or utility, then become a foundation of the operating license, and their implementation is checked by NRC inspectors.

BAKER: In a way it’s like the financial regulatory process, where government examiners ask each institution about its security plan. The examiners don’t require any particular measures, but when one institution tells them about a security measure (and what problem it solves), a good examiner asks the next bank he inspects whether it has adopted the same measure, and if not, how it’s solving the same problem. The second bank doesn’t have to use the first bank’s measure, but it does need a good answer to that question. The result is a flexible and constantly improving security posture. At least if you’ve got really smart and engaged examiners.

LANGNER: Mm-hmm. What’s clear is that the market isn’t solving this problem. Most suppliers of industrial control systems are longtime makers of electrical equipment. Their core competence isn’t software, and they aren’t naturally adept at software security. They aren’t likely to push solutions out to their customers.

BAKER: Really? There’s nobody you think is doing a good job on that front?

LANGNER: None of the big guys. In ICS security, the small vendors are definitely the champions – such as Schweitzer Engineering Laboratories, which has understood that even hardcore electrical systems such as high-voltage relays absolutely require top-notch cybersecurity. On the customer side, too, there are a few companies that have built cybersecurity certification into future purchases. Shell is the most notable; it has set cybersecurity criteria for future purchases of industrial control systems.

BAKER: Are you saying that it’s the Americans who are talking about cyberwar, but it’s the Europeans who are actually doing something about it?

LANGNER: If you want to benchmark, the US is certainly the global leader. Unfortunately, as with any benchmark, that still doesn’t tell you whether you’re good enough.

BAKER: Ralph, thanks for the conversation. Considering how depressing the topic has been, it still leaves me with a little hope. Defending against infrastructure cyberattacks isn’t an impossible task. There’s a lot we can do.

And once industry and government have tried everything else, I expect they’ll do what’s necessary.

LANGNER: That’s my hope as well.
