30 July 2019

Richard Clarke is sounding the alarm about another kind of 9/11

BY ALEX PASTERNACK

Richard Clarke knows a few things about clear and present dangers. He had already served under six presidents and been appointed the U.S.’s first counterterrorism czar when he joined the George W. Bush White House, but when he tried to alert important decision-makers before September 11 about the threat of a terrorist attack on U.S. soil, those warnings were largely ignored. (Afterwards, he famously apologized publicly for the government’s failures.) These days, Clarke is still trying to get people to think hard about the next big attack—the cyber version—and all the ones that have already happened.

Clarke’s new book, The Fifth Domain, written with cyber expert and fellow White House veteran Robert Knake, is in many ways a follow-up to their 2010 book Cyber War. That book was derided by some at the time as science fiction, Clarke laments; now he has the benefit of a sci-fi-like series of developments to illustrate his case. And yet, while we’ve been blindsided by a slew of giant hacks, thefts, and attacks, the prospect of a Cyber 9/11 or Cyber Pearl Harbor is still hard to grasp. Maybe those are the wrong metaphors.


As Eric Rosenbach, a former Pentagon cyberczar, tells the authors, “The big cyberattack is going to be something that undermines our democracy in a way that leads Americans to question the viability of our system.”

Clarke says that’s exactly what happened in 2016, when Russian agents hacked emails, spread propaganda, and launched attacks on voting systems to interfere with the U.S. election, and we can expect a lot more of that going forward. He also worries that cyberwar could lead to a shooting war, especially now that the Trump administration has taken a more aggressive stance toward confrontations in this new, fifth domain of warfare. “Every time I see the U.S. and Iran getting closer to something, I worry that it could get out of control because I don’t think we’re ready,” he says.

Either way, there is no putting cyberweapons—and especially potent military worms like Stuxnet, which was used to attack Iran’s nuclear centrifuges and was discovered in 2010—back in the box. That doesn’t mean being a victim is inevitable, says Clarke, who has his own corporate cybersecurity consultancy. Companies and the government need to be more resilient, and Clarke and his coauthor offer 80 recommendations on what the U.S. should be doing, including enhancing standards for the power grid and outsourcing some government cybersecurity to a growing private industry. I recently spoke with him about those suggestions and other ideas and concerns raised in the book; our conversation, which happened by phone and email, has been slightly edited for clarity.

Fast Company: You and others warn about a cyber 9/11. Do you see parallels with your time in the White House, when you were raising concerns about the terrorist threat that culminated in what happened on September 11?

Richard Clarke: We saw the terrorist threat evolving during the ’90s, and we pretty much nailed it in terms of saying how it was going to evolve. During the Clinton administration we were able to get people’s attention, and we quadrupled the amount of money the federal government spent on counterterrorism. We set up new organizations. We spent a lot of money on domestic capabilities to deal with chem and bio attacks. And we did worry about security in the air. People would never take it seriously enough to do the things we suggested about the air side, which was to have the federal government take over the screening of people at the airports. Not a chance. Let’s think about maybe having air defenses in Washington. Not a chance.

I did a book two years ago called Warnings where we did 14 case studies where experts, data-driven experts, predicted things would happen, and no one paid attention, no one acted on the basis of their warnings. And we asked the question across those 14 case studies: why is that happening? And the most powerful reason we came up with was that the thing that we were talking about had never happened before.

We came up with a fancy term for this—I think it was Initial Occurrence Syndrome. And after the 2016 election, when she was drinking a lot of Chardonnay, Hillary Clinton read the book. And so when she goes to write her book, What Happened, she refers to Warnings and says that even though there’s evidence that something’s about to happen—if you’re warned by experts—and it’s never happened before, you don’t take it seriously. And that’s what happened in the case of the Russian interference in the election. That’s what she says.

I think with cyberwar, we have the same kind of phenomenon. We’ve got lots of experts, lots of data, predicting significant cyberactivity, and we’re paying lip service to dealing with it. People are ignoring it. As we point out in the book, there are some companies who are spending a huge amount of money, and they’ve actually achieved security—the dogs that do not bark. But in general, people are not paying enough attention to it, because there has never been that big event. Or it’s never been a big event to them.

FC: It seems like a corollary to a sort of “failure of imagination,” which was mentioned in the 9/11 report as well.

RC: And I kind of resented that because I thought we did pretty well imagining. It was certainly imagined in the case of cyberattacks. We also have in the book a little fictional segment set in the near future of how an Israeli-Iranian confrontation spirals out of control. So I think plenty of people have plenty of imagination about how it could happen, but it’s never happened in the big way yet. But it’s happening more and more and more.

FC: You have written fiction, and I heard your pre-9/11 concern about planes-as-missiles came from a Tom Clancy book. Fiction seems important here.

RC: Clinton too got some of his ideas about threats from fiction. As an occasional fiction author, I think fiction is sometimes the best way to reveal the truth.

FC: In terms of getting people’s attention . . .

RC: When we wrote Cyber War 10 years ago, the Russians had used cyber against Estonia, they had used it against Georgia, but that didn’t really get the world’s attention. Now, in just the last couple of months, the U.S. admits, we launched a cyberattack against the Russian Internet Research Agency. The U.S. admits that we’ve hacked our way into the controls of the Russian power grid. The U.S. admits that we responded to the downing of our Global Hawk drone by a cyberattack on Iran. So shots are being fired, and they are being fired a lot. But still, people can’t imagine the big ones.

FC: I guess some people have the imagination, but maybe the people who need to make the decisions aren’t imaginative enough themselves, or aren’t hearing out the concerns in a reasonable way?

RC: There is some downside to acting before there’s a crisis. We talked about it in the Warnings book. If you act too soon, and you succeed in stopping the crisis, people will think you were crazy and wasted money. I don’t know what you think about Y2K—I happen to think that would have been a problem if we hadn’t spent all the money and done all the things we did. But because we prevented that problem, I think a lot of people, when you say Y2K, say, oh yeah, that was a joke.

FC: And all of the attacks that have been prevented too, no one knows about those. As to how to respond to a crisis, as you mentioned, the response seems to be changing by the month under this administration. You discuss in the book how for too long we were too focused on defenses and not enough on offense.

RC: Well, in one narrow sense, and that is U.S. Cyber Command. U.S. Cyber Command was tied up in knots, legally and in terms of policy. The Obama administration, after Stuxnet happened—and they kind of felt burned by Stuxnet—they put a lot of restrictions on Cyber Command. And we think that was probably too much. But now the pendulum has swung far in the opposite direction, which is that the White House no longer has to approve things. To devolve all of that decision-making for Cyber Command down to the Pentagon itself, that’s probably too extreme.

FC: The U.S. has acknowledged that Moscow already has the ability to shut down parts of the U.S. power grid. In terms of the correct response, I think of another parallel back to 9/11, which is the issue of who is responsible for stopping the bad guys. I think about how in the lead-up to 9/11, the CIA was tracking hijackers within the United States, apparently in the interest of gathering intelligence, and effectively kept that information from the FBI. I wonder if you see a parallel there in terms of how we respond to cyberintrusion today.

RC: There is a deconfliction mechanism. And there was, by the way, for terrorism before 9/11. The CIA just didn’t use it on one critical occasion. But there’s a deconfliction mechanism now for cyberattacks. So that if the CIA wants to do something, they have to talk to NSA and Cyber Command, and vice versa, so that, in effect, they don’t reveal each other’s operations. I have no reason to believe that’s not working.

FC: What is crucial right now in terms of defending critical infrastructure, from the U.S. perspective, that we’re not doing?

RC: The most important thing is obviously the power grid, and along with that the natural-gas pipelines on which the power grid depends. And if you look at our regulatory structure, the federal government only regulates a piece of the power grid for security. I think it should regulate the whole thing, including the gas pipelines. It should have third-party auditing on a regular basis. And it should raise the level of the security regulations and require segmentation of devices on the network. Because when the Director of National Intelligence testifies, as he did this year, that the Russians are in our power grid, that should have sounded an alarm—that should have led us to say, Okay, let’s get them out! Where is the big national program to get them out and make sure they can’t come in again?

If we had said a Russian army division is in Kansas, maybe people would have reacted? Well, the Russian cyber GRU outfit is in our power grid? It’s like, “Oh yeah, I’ve heard that before, so what.”

FC: You’ve worked in the administrations of four presidents, from Reagan through George W. When you think about your experience there and you look at the White House today, what are you worried about? What do you think about?

RC: When the Trump administration began, they had a really good guy doing the cybercoordination job, the so-called cyberczar job. They didn’t give him all the power that, frankly, I had, but he was a good guy, and he had the right title. And then John Bolton came in and fired him, and didn’t replace him. The guy was named Rob Joyce.

FC: The NSA veteran.

RC: And he went back to NSA and he’s back there now. I’m sure he’s doing good things back there. The White House wrote a nice national policy or national strategy for cybersecurity, but how are they going to implement it when they’ve got no one in the White House to do that? They did the same thing in the State Department. There were people working on international norms and maybe someday arms control. And they got rid of that and demoted the office.

So I don’t see a lot happening. They agreed to taking the little bits of DHS that were working on cyber and putting them together under a label that says Cybersecurity Agency. But they only agreed to do that if there were no new resources spent. So we’ve got a little cybersecurity agency over at DHS, and the guy in charge of it, [Christopher] Krebs, is pretty good, but he doesn’t have anywhere near the resources he needs.

What we propose in the book is, rather than having every federal agency try to protect itself, we take that cybersecurity agency at DHS and give it the job, and the resources it needs to do it, for all the little federal agencies and departments—not the Department of Defense. For all the ones that aren’t up to it—like OPM [the Office of Personnel Management]—have it be a managed security service. Have it be outsourced.

If you look for example at the way the states do IT, many of the states have one IT department. And if you’re working in any of the state agencies or departments, you don’t have to worry about IT. It’s a service. We ought to do that for a lot of the federal government, and create a senior cyberservice. We’ve got a senior executive service, a senior intelligence service, a senior foreign service. It’s time for a senior cyberservice. And they all ought to work out of the cybersecurity agency. And they ought to be the CISOs [chief information security officers] at all the agencies and departments.

FC: On the private side, meanwhile, I wonder how you see the way companies think about cybersecurity and protecting user data shifting. How does money come into play in terms of incentivizing good behavior?

RC: It’s all about money. The companies that have achieved cybersecurity have spent somewhere between 8 and 10% of their IT budget every year. And the companies that are being hacked are down at the other end of the spectrum, spending 3 or 4%. If you spend 3 or 4% of your IT budget on cybersecurity, you’re going to be hacked. You already have been, and you don’t know it. So it is about money.

It’s also about governance. Where does the CISO—the chief information security officer—sit in the hierarchy? To whom does she report? If she’s reporting to the CIO, then you have a problem because you have conflicting interests. And if the CISO says, I need this level of support financially, or in terms of policies and procedures, and they don’t get it approved, the fact that they were denied those resources needs to be told to the board of directors. And there needs to be somebody on the board of directors who is cybersavvy and there needs to be a cyberbriefing every quarter at least to the risk committee of the board. And companies that get the governance part right—where does the CISO report, what’s the role of the board, is there a cyber guy on the board?—companies that do that right, they don’t get hacked.

FC: In terms of motivating that, and imagining the worst, I wonder if there are financial incentives that can come into play. Obviously, there’s a big financial incentive to not get hacked.

RC: I’m not in favor of tax relief, because that just turns out to be an expenditure. I think the financial incentive is that you’re not gonna lose hundreds of millions of dollars and suffer reputational damage when you get attacked.

We talk in the book about cyberinsurance and how it could do more than it’s doing. Cyberinsurers could set standards, if they were willing to provide greater coverage than they are. They’re not willing to offer greater coverage right now, because they’re afraid they don’t understand the risk. They’re afraid of the cyber 9/11 where everybody gets hit and the claims bankrupt the insurance company. So they limit the policies. We had the same problem with terrorism. And we created a government backstop in case there was a major terrorist attack. We backstopped the insurance companies. We call for the U.S. to do the same thing for cyberattacks.

FC: It makes me think of privacy issues too. It’s hard to imagine what the cost is until the attack has happened. It’s hard to imagine what it means to lose millions of records of data, even after you’ve lost them, if you even know you’ve lost them—and even how to put a price on that—so it goes back to the issue of imagination.

RC: Okay, so if you look at the NotPetya attack, it erased software on every device on the network. You can price out the cost—companies were down, completely down, some of them for weeks. And the insured damages estimate on the NotPetya attack is $10 billion. That’s just the insured damages.

FC: The ransomware attacks seem like an interesting case study in how to respond, because some people are paying ransoms and some aren’t, and at great cost.

RC: I think what’s interesting to me about ransomware is it’s picking off the low-hanging fruit. You know there’s that old joke that you don’t have to outrun the bear; if there are three or four of you running, you just have to run faster than the other guy. Well, that’s kind of the case with ransomware. Ransomware is picking off the slow runners. Ransomware is picking off the people who are spending 3 to 4% of their IT budget on security.

FC: Some have reported that the Baltimore attack was actually the result of a stolen NSA tool. A company in Abu Dhabi called Dark Matter hired a bunch of former NSA people to basically build a private intelligence service. I wonder how you think about the potential for these weapons to get loose, and for experience and talent to get loose?

RC: Well, one thing it tells me is that security at NSA hasn’t gotten any better since Snowden. After the Snowden leaks, Obama asked me and a couple of other guys to go into NSA and figure out what went wrong. And we gave them a whole series of recommendations on how to improve their security, and Obama approved all of those recommendations. But we find that contractors are walking out of the building, going home with attack tools in their briefcases. That should never be allowed to happen. There’s easy technology to make sure that doesn’t happen. If you’re gonna make weapons, you’ve got to secure them. If you’re going to make nuclear weapons, you have to have high security. Well, if you’re going to have cyberweapons, you have to have high security. And when we let these Booz Allen contractors walk out of the building with the weapons, that’s criminal. We should no more let them walk out of the building with cyberweapons than we should let them walk out of the building with nuclear weapons.

FC: But it’s so easy to do. Compared with something like nuclear weapons, on a practical level, it seems very difficult to control these weapons.

RC: No, not really. No. They’re on government servers, and you can easily put software on servers that prevents the weapons from being downloaded. There’s no reason why these guys should have been able to download this stuff. Alarms should have gone off. It bothers me a lot.

Arms control, we think, is going to take a long time, but it’s necessary. And if we don’t start now, we’re never going to get there. You could start with simple risk-reduction measures, with confidence-building measures, and some international norms. Start there, but there’s no dialogue right now.

FC: What do you think about private companies becoming involved not just in cyberdefense but in cyberoffense? I think of firms like Dark Matter and NSO Group but also PSY Group and Cambridge Analytica . . .

RC: We need to do a better job enforcing the Arms Export Control Act when it comes to cyberweapons. We may need a new law to clarify the limitations, but offense should be left to the government. Otherwise, deconfliction becomes a problem.

FC: Some of your emails were leaked a few years ago. I wonder if you’ve been personally affected, I mean beyond that, by cyberwar, by cyberattacks, and how that’s changed the way you see this . . .

RC: What happened to me was that my computer was never hacked. The guy who was sending me emails and receiving mine in return, he was. And what that demonstrates is that the security of your communications, whether it’s a text or email or voicemail, is only as secure as both ends of that conversation. And I didn’t mind so much what came out of my emails, because I thought I looked pretty good. But you have to consciously be aware all the time that anything you put in an email or text can be publicly exposed. And if you’re the slightest bit a public figure, you know, they probably will be exposed.

FC: Did that change your behavior in terms of security? Or has the writing of this book changed your thinking about any of these issues?

RC: I think when I started working on cybersecurity back in the White House, that changed my attitude a lot. And I realized from earlier scandals involving Ollie North and people like that in the White House that White House emails have a good chance of coming out. When the 9/11 Commission did its investigation, it pulled all of my emails and even got the White House to declassify some of them. And they were revealed, including some things like me saying that the director of CIA has bipolar problems. So it was a little embarrassing. I learned that you gotta be careful with what you put in an email, because even in the most secure environments things can get out.

FC: If you can go back to that time in the White House, could you have imagined that we would be in the situation we’re in now on the cyberterrain, with Iran, China, Russia—and Trump?

RC: What happened was, the Oklahoma City terrorist attack [in 1995] came around the same time as a chemical and biological weapons attack by a terrorist group in Tokyo.

FC: Sarin.

RC: Bill Clinton said to us, you know, what if somebody put together a big attack like Oklahoma City with chemicals and biological weapons, and what if they blew up, not just a random federal building, but like a key node for something like telephones or something? Do we know where the key nodes are? And we said, no, we’re basically not ready for this.

And he said, okay, let’s set up a presidential commission and look into it. I thought that commission was going to look into critical infrastructure, meaning things like critical nodes of the telephone network or critical bridges over the Mississippi or something. But that commission came back and said, Hey, something new is going on now. Everything is moving to the internet. Controls for all sorts of important things are moving to the internet. And the internet is not secure, and that means that people can attack over the internet. My mind was blown by that. That was 1997.

But that commission was prescient. And then, beginning in 1997, I started learning about what could be done, and spent a full day at the NSA getting read into everything that they could do. I came back to the White House and reported, basically, if the NSA is after you, they’re going to get you; there was no way that they could be stopped. And while we may be the only ones right now that can do that, I said, other people are going to be able to do that. And we are not ready. That was 1997.
