
The SolarWinds Wake-up Call

Image: Gorodenkoff – stock.adobe.com
February 19, 2021

The massive SolarWinds hack—discovered almost by accident by the private security firm FireEye—demonstrates all too graphically just how permeable the nation’s cyber defenses are.[1] The terminology is confusing, but it is worth distinguishing between cyber espionage, breaking into networks to exfiltrate information, and cyberattack, when the purpose of the break-in is to cause physical damage and possibly hurt people as well. A cyberattack is an act of war under international law; the 2010 Stuxnet attack on Iran, reportedly a joint U.S. and Israeli operation, qualifies because it destroyed Iranian centrifuges. So far, the SolarWinds hack is “only” espionage.

By Gregory F. Treverton

Note: The views expressed here are those of the author and do not necessarily represent or reflect the views of SMA, Inc.

Yet the rub, as one expert put it to me, is that “cyber espionage is cyber espionage until it isn’t”—once a hacker breaks in, who knows what might be left behind to cause damage later. If this had been an attack, it would have been orders of magnitude worse than the previous worst, NotPetya, in 2017.

What We Know…and Don’t Know

Let’s start with what we know and don’t know about the SolarWinds hack. We are tolerably certain that its origin was Russian, probably the foreign intelligence service, the SVR. FireEye hasn’t made the call, but U.S. intelligence has pointed to the SVR. What we are certain about is that the hack was very sophisticated. There seems to be no evidence that it was an “inside” job, but it was a very big operation, perhaps involving a thousand Russian hackers, who were very good at hiding their actions. They made use of infrastructure inside the United States, what are called “hot spots” in the trade, which kept much of their activity out of view of U.S. intelligence agencies that are not authorized to monitor domestic networks. But that does not excuse the failure.

The hackers got into SolarWinds, a part of the information technology (IT) supply chain, and attached malware to a routine software update. The update, and with it the malware, went to 18,000 institutions, including at least six government agencies. The Russians could thus pick and choose targets; in the words of another expert, they “got the pick of the litter.” We also know that this was an enormous intelligence failure, and also a failure of the “Defend Forward” strategy announced in the 2018 Department of Defense Cyber Strategy. The strategy aims to put American beacons in adversaries’ networks, especially those of China and Russia, to learn of and stop attacks before they happen. It plainly failed, for the government’s premier cyber agency, the National Security Agency (NSA), had no idea what was afoot.

What we don’t know is how many institutions, of the 18,000 vulnerable, were actually hacked. The number is at least in the tens and perhaps in the hundreds. The Department of Homeland Security (DHS) apparently is currently investigating a hundred cases. Alas, this unknown comes as no surprise, for the current state of affairs provides only weak incentives for companies to report: why give your competitors a gift by revealing that you’ve been hacked? FireEye has been very transparent; another company involved, Palo Alto Networks, has not.

We also don’t know what the hackers left behind, or if or when cyber espionage might turn into cyberattack. Nor do we know what information the hackers gained access to, or how they plan to use it. We cannot be sure whether we have found and sealed every entry point they used. And the biggest question remains: what was the motive? Was it “only” cyber espionage?

What Next?

The challenge of improving cyber defenses is that cyber policy requires working across three divides—offense/defense, civil/military, and public/private. On the first, while very few people know how much the nation spends on offense, the hack surely suggests that it spends too little on defense. Of the three, the last is especially critical: virtually all of the infrastructures, including IT, that adversaries might hack are privately owned but constitute an enormous public interest.

For starters, critical government cyber institutions need to be beefed up. One is CISA, the Cybersecurity and Infrastructure Security Agency—the agency that, like New York, New York, likes “security” so much it uses the word twice in its title. Its Einstein tool monitors federal networks, looking for malware “signatures”; in that task it has the benefit of access to signatures from both the commercial world and U.S. intelligence. In this case, though, the signature was new and thus not recognized. Einstein’s second line of defense is noticing unusual behavior on the networks, but that capability is limited: the tool can recognize a thousand pings in an hour as an anomaly but not, say, the unusual size or other attributes of an update. It needs to be upgraded. So, too, could CTIIC, the Cyber Threat Intelligence Integration Center, be central—especially in connecting intelligence to the private sector—but it too is under-funded.
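
The gap is easier to see in a toy example. The short Python sketch below is a hypothetical illustration of the two kinds of checks just described, signature matching plus a crude behavioral rule; the names, hashes, and thresholds are all invented for illustration and say nothing about how Einstein is actually built:

# Hypothetical sketch: signature matching plus a simple volume-based anomaly rule.
# Every name, hash, and threshold here is an invented assumption for illustration.

KNOWN_BAD_HASHES = {"a1b2c3", "d4e5f6"}   # signatures drawn from industry and intelligence feeds
PINGS_PER_HOUR_THRESHOLD = 1000           # crude behavioral rule: too many pings is anomalous

def flag_traffic(update_hash, pings_last_hour, update_size_mb):
    """Return the alerts a signature-plus-volume detector would raise."""
    alerts = []
    if update_hash in KNOWN_BAD_HASHES:             # first line of defense: known signatures
        alerts.append("known malware signature")
    if pings_last_hour > PINGS_PER_HOUR_THRESHOLD:  # second line: simple volume anomaly
        alerts.append("anomalous ping rate")
    # Note what is never examined: the update's provenance, unusual size, or other
    # attributes -- precisely the kind of gap a novel, quiet implant can slip through.
    return alerts

# A fresh implant with an unknown hash, riding a normal-looking update, raises no alert:
print(flag_traffic(update_hash="f7e8d9", pings_last_hour=40, update_size_mb=350))  # -> []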

Two immediate imperatives might require legislation. SolarWinds underscores the need for more transparency and accountability in IT supply chains. It is simply imperative that institutions know where their software and updates are coming from—and preferably not from Belarus, as in the SolarWinds case. The place to start is with government contractors, for suppliers in the Defense Industrial Base are already held accountable.
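
What such transparency might mean in practice can be sketched briefly. The Python example below is a hypothetical illustration of the sort of provenance check a contracting rule could demand before an update is installed (an approved build origin, a signed attestation of the build pipeline, an inventory of third-party components); the field names and the allow-list are assumptions, not any existing standard or agency requirement:

# Hypothetical provenance check on an incoming software update.
# The field names, policy, and allow-list below are invented for illustration only.

APPROVED_BUILD_ORIGINS = {"US", "CA", "DE"}   # illustrative allow-list set by acquisition policy

def vet_update(update):
    """Return the reasons to reject an update before installing it."""
    problems = []
    if update.get("build_origin") not in APPROVED_BUILD_ORIGINS:
        problems.append(f"build origin {update.get('build_origin')!r} not on approved list")
    if not update.get("signed_attestation"):
        problems.append("no signed attestation of the build pipeline")
    if not update.get("component_inventory"):
        problems.append("no inventory of third-party components")
    return problems

# Example: an update whose build pipeline runs through an unapproved country is flagged.
suspect = {"build_origin": "BY", "signed_attestation": True, "component_inventory": ["libfoo 1.2"]}
print(vet_update(suspect))   # -> ["build origin 'BY' not on approved list"]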

With regard to breaches, while there is some requirement for companies to report to the Securities and Exchange Commission (SEC), those requirements need to be tightened. In my conversations with, for example, big financial houses, I get the sense that the country is caught in a kind of Prisoner’s Dilemma: the companies say they can afford to simply absorb five percent losses from cyber intrusion; better to stay silent than to push customers to their competitors by announcing a hack. Yet what may be acceptable for Chase or another individual company leaves the country under-protected when that same pattern is repeated across many, if not most, companies. It would make sense to tighten reporting requirements and couple that with a group like the National Transportation Safety Board (NTSB), probably located in CISA, to investigate major breaches.
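
The arithmetic of that dilemma is simple to lay out. In the back-of-the-envelope Python sketch below, the five percent figure comes from the conversations just mentioned, and every other number is an invented assumption used only to show the shape of the incentive:

# Back-of-the-envelope illustration of the incentive gap. The five percent loss figure
# comes from the conversations described in the text; all other numbers are invented.

FIRM_REVENUE = 1_000                               # arbitrary units
ABSORBED_LOSS = 0.05 * FIRM_REVENUE                # cost of quietly absorbing intrusions
DISCLOSURE_COST = 0.08 * FIRM_REVENUE              # assumed cost of customers leaving after disclosure

# For any single firm, silence looks cheaper:
print(ABSORBED_LOSS < DISCLOSURE_COST)             # -> True

# But if most firms reason the same way, the losses aggregate nationwide and no one
# gains the shared information needed to harden defenses:
NUM_FIRMS = 10_000
print(NUM_FIRMS * ABSORBED_LOSS)                   # -> 500000.0, the collectively "acceptable" loss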

The creation of the new National Cyber Director offers the opportunity to do better at working across the public/private divide. Already, the IT Information Sharing and Analysis Center (ISAC)—one of a number of ISACs bringing companies together to share best security practices—works pretty well, in conjunction with DHS. Legal modifications in the last few years have made it easier for companies to share best practices without running afoul of antitrust laws.

Finally, while the task is daunting, it is past time to work on international rules of the road in the cyber world. The 2015 agreement between the United States and China not to conduct cyber espionage for commercial purposes was a start. It did result in fewer such Chinese cyber operations, at least for a time, until Sino-American relations turned sour across the board. Washington might start with like-minded partners, then reach out to China and Russia.

As is the case with arms control, measures that constrain our adversaries are always attractive; the challenge comes with the realization that the measures also have to apply to us. A ban on cyber espionage isn’t going to happen, for we and many other countries engage in it, and reaching for “no attacks on IT supply chains” probably also is a bridge too far. But forswearing attacks on infrastructure is a place to start. It is worth a try, for while the United States is now probably more dependent on the cyber realm than any other country, the dependence of all countries will only grow.

[1] This article draws on remarks I made to an Aspen Institute Congressional Program breakfast, February 10, 2021.

Edited by Dick Eassom, CF APMP Fellow
Published on February 19, 2021, by SMA, Inc.