
KS1013 HKS Case Number 2029.0

The Vulnerability Economy: Zero-Days, Cybersecurity, and Public Policy

Inspiration Strikes: Stuxnet

Dillon Beresford saw Stuxnet as a challenge. In the summer and early fall of 2010, news of “Stuxnet” appeared in headlines across the globe. At first, vague accounts circulated among computer security professionals:2 VirusBlokAda, an anti-virus company based in Belarus, had spotted an odd, malicious new worm whose purpose was unknown.3 At first blush, it was not clear precisely what this new piece of malware targeted. The appearance of a new piece of malware is not, in and of itself, particularly newsworthy; new viruses, worms, and other forms of malware are regularly discovered.4 But as researchers began to analyze the worm, later christened Stuxnet, they quickly realized that they had stumbled onto a much larger story. Stuxnet appeared to have a singular, very specific target in its sights: uranium enrichment facilities located in Iran.5

The discovery that Stuxnet targeted Iran’s nuclear program sparked an international media frenzy. The intrigue was impossible to ignore: Had a piece of malware sabotaged Iran’s nuclear efforts? Had lines of code been sent to achieve what sanctions and diplomacy could not? Press accounts dubbed Stuxnet a new breed of “cyberweapon”—an opening salvo in a new era of “cyberwar.”


Stuxnet attacked Iranian uranium enrichment facilities at Natanz. The mechanics of the worm were impressive: it relied on four previously unknown flaws, known as “zero-day” or “0-day” vulnerabilities, in Windows software to gain access to vital industrial control systems involved in uranium enrichment.8 Industrial control systems are computerized systems used to control and monitor physical systems; they are widely used in many industries, including electric power, water, transportation, and manufacturing.9 Stuxnet was ingenious. It altered key components of the industrial control system, devices known as programmable logic controllers (PLCs), that controlled the operation of centrifuges used in enrichment.10 By manipulating the PLCs, Stuxnet was able to speed up and slow down the operation of the centrifuges at will, while camouflaging these changes from increasingly confused Iranian operators. To on-site observers, the centrifuges appeared to be operating as planned; system monitoring reported no unusual activity.11 Yet the damage was real. The fluctuations in speed led to the temporary destruction of roughly 20% of the centrifuges operational at Natanz.12 A cyberweapon had caused physical damage. Estimates suggested that the damage at Natanz set back the Iranian nuclear program by between 18 months and two years.
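The deception just described (real centrifuge speeds swinging wildly while the monitoring layer replays an “all clear” reading) can be illustrated with a toy simulation. Everything below is invented for illustration: the speeds, the phase timings, and the function names are assumptions, not details of Stuxnet itself.

```python
# Illustrative simulation, NOT Stuxnet code: a compromised controller varies
# the real centrifuge speed while the spoofed monitoring layer keeps
# reporting a normal value, so operators see nothing unusual.

NORMAL_RPM = 63_000  # nominal speed; the figure is illustrative

def actual_speed(t: int) -> int:
    """Real speed: periodically pushed far above and below nominal."""
    cycle = t % 30
    if cycle < 5:
        return 84_000   # overspeed phase stresses the rotor
    if cycle < 10:
        return 120      # near-stall phase
    return NORMAL_RPM   # otherwise run normally to avoid suspicion

def reported_speed(t: int) -> int:
    """What the falsified monitoring feed shows the operator."""
    return NORMAL_RPM   # replayed "normal" reading, regardless of reality

divergence = [t for t in range(60) if actual_speed(t) != reported_speed(t)]
print(f"readings falsified at {len(divergence)} of 60 time steps")  # 20 of 60
```

The point of the sketch is the gap between `actual_speed` and `reported_speed`: any defense that trusts the reported value has nothing to alarm on.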


This case was written by postdoctoral fellow Ryan Ellis for Venkatesh Narayanamurti, Benjamin Peirce Professor of Technology & Public Policy at the Harvard School of Engineering and Applied Sciences, for use at the Harvard Kennedy School (HKS), Harvard University. HKS cases are developed solely as the basis for class discussion. Cases are not intended to serve as endorsements, sources of primary data, or illustrations of effective or ineffective management. (January 2014) Copyright © 2014 President and Fellows of Harvard College. No part of this publication may be reproduced, revised, translated, stored in a retrieval system, used in a spreadsheet, or transmitted in any form or by any means without the express written consent of the Case Program. For orders and copyright permission information, please visit our website at www.case.hks.harvard.edu or send a written request to Case Program, John F. Kennedy School of Government, Harvard University, 79 John F. Kennedy Street, Cambridge, MA 02138.


Commentators suggested that Stuxnet must be the work of a nation-state. Only a nation-state, or possibly a collaboration among nations, could harness the resources needed to create something this sophisticated. Creating Stuxnet, it was charged, must have been incredibly costly and difficult. Zero-day vulnerabilities, particularly in a product such as Windows, are difficult to find. On the black market, a single Windows zero-day can sell for six figures.15 Typical malware might exploit one previously undiscovered flaw, but four? The use of four different zero-days in a single piece of malware was unprecedented.16 The expertise or money needed to either discover or procure multiple zero-days was, it was assumed, beyond the reach of non-state actors. Additionally, in order to disguise itself, Stuxnet used two different stolen digital certificates. Digital certificates work as authorization mechanisms that validate software; without a valid certificate, malware might be quickly uncovered or blocked from running. While forged certificates had been seen before, the theft of legitimate certificates was novel and suggested the work of a very resourceful author.17 Finally, Stuxnet demonstrated a great deal of understanding of the configuration of the control systems and PLCs at Natanz, information that was hardly widely known.18 Taken together, observers suggested that only a nation-state could pull off something as complex as Stuxnet. Who else could possibly marshal these vast resources in such a sophisticated manner?

Dillon Beresford was not convinced. News of Stuxnet was inescapable; the popular press covered it breathlessly, providing more and more narrative detail. At the time, Beresford was working as a security researcher at NSS Labs in Austin, TX. While the accumulated evidence certainly pointed to the work of state actors, Beresford took the claims that only a state could pull off something like Stuxnet as a challenge. Stuxnet inspired him. Was creating a sophisticated cyberweapon such as Stuxnet really only possible with the backing of a nation-state? Maybe he could do it. Maybe it was not quite as hard as it appeared.


Beresford set out to see if he could find a previously undiscovered flaw—a zero day—in PLCs similar to those that had been attacked by Stuxnet. He was not interested in actually disabling control systems operating in the field or causing physical destruction. Rather, he was motivated by curiosity. Was it really so difficult to disrupt industrial control systems? To be sure, a government organization was likely behind Stuxnet. Reporting by David Sanger at the New York Times, as well as others, identified the United States and Israel as the key players behind the worm. But that did not necessarily mean that a Stuxnet-type attack was beyond the reach of non-state actors. Perhaps clever and enterprising individuals could do something similar. Beresford decided to see for himself.


Searching for Zero-Days

Dillon Beresford got to work. He borrowed $2,000 from NSS Labs and purchased a few Siemens SIMATIC Step 7 (S7) PLCs from PLCTrainer.net, an online vendor.21 Stuxnet had also targeted Siemens S7s.22 Working alone in his apartment late at night after leaving NSS, Beresford started looking for flaws. These were not specialized laboratory conditions. Beresford later wryly described his no-frills workspace as an “apartment on the wrong side of town where I can hear gunshots at night.”23 After two months of near-sleepless nights, he found what he was looking for.24 Working in his spare time, and armed with little more than his laptop and the purchased S7s, Beresford found multiple previously undiscovered vulnerabilities.


It was a significant discovery. The vulnerabilities impacted multiple lines of Siemens S7 PLCs. Siemens is a leading global provider of control systems and automated equipment; in FY2012, its industrial sales generated $25 billion in revenue.27 The vulnerabilities Beresford uncovered had the potential to cause serious damage. The S7 PLCs are widely used in a variety of infrastructures and industrial processes, including electricity distribution, wastewater plants, manufacturing, and transportation systems.28 The new vulnerabilities enabled an enterprising attacker to craft exploits—code that takes advantage of a particular vulnerability—that could be used to effectively gain “complete control” of the PLCs.29 Now, an attacker could reprogram PLCs without authorization and control their operation: shutting down PLCs that were supposed to be working or, perhaps even more troubling, manipulating PLCs to perform operations at the discretion of the attacker. Beresford also found a flaw that would allow an attacker to change the password protecting a PLC without the operator’s knowledge, effectively freezing the engineer out of their own system.30 These were not trivial vulnerabilities: altering PLCs could—as Stuxnet’s action at Natanz showed—be used to cause physical damage in the real world.

The discovery confirmed Beresford’s suspicions: it was not as hard as it looked. Nation-states were not the only ones that could wreak havoc on physical systems with malicious exploits. He realized that “it’s not just the spooks who have these capabilities”; “[a]verage guys sitting in their basements can pull [it] off” too.31 It was a discovery that had potentially explosive consequences. In the wrong hands, the vulnerabilities could be used to attack infrastructures operating S7 PLCs. Thanks to Siemens’ wide reach, there would likely be no shortage of potential targets. Beresford had discovered something interesting. Now a new question emerged: What was he going to do with it?

The Importance of Disclosure: Zero-Day Vulnerabilities and Zero-Day Malware

Beresford confronted a dilemma familiar to independent security researchers: What should he do with his newly discovered vulnerabilities? Should he release the discovery to the public? Should he report the problem to the vendor while keeping the public—including users of Siemens’ flawed equipment—in the dark? Could he sell the vulnerabilities? If so, who might buy them, and what would they be willing to pay?


These questions are not new. The computer industry, policymakers, and independent security researchers have debated the issue of vulnerability disclosure for years. The ethical and practical issues are complex.34 Identifying vulnerabilities improves what are invariably flawed pieces of software and hardware. Even when a vendor pours resources into testing and development, unknown flaws will still remain.35 Ferreting out unknown flaws allows vendors to introduce new patches and updates that enhance security. To this end, independent security researchers—such as Beresford—provide a valuable service: they help improve the overall quality of computer security. Yet discovering new vulnerabilities is not an unqualified good. In the wrong hands, a zero-day vulnerability can be used to cause significant harm.36 Zero-day vulnerabilities are key components of new forms of malware.37 For an enterprising attacker, vulnerabilities that are not yet recognized by the vendor, anti-virus programs, or the general public are a goldmine. Malicious exploits built upon zero-day vulnerabilities, “zero-day malware,” are prized by attackers and loathed by those concerned with security. Zero-day malware is new and difficult to detect. Anti-virus programs, intrusion detection systems, and other defensive measures, for the most part, cannot defend against what they cannot recognize. These forms of malware slip through standard defenses. Until the vulnerability is publicly identified and disclosed, zero-day malware can operate with near-impunity.
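The logic of that blind spot can be shown in a few lines. The sketch below uses a hash set as a stand-in for an anti-virus signature database; the payloads and the `scan` helper are invented for illustration, and real engines are far more sophisticated (heuristics, behavioral analysis), but the core limitation is the same: an exact-match signature cannot flag a sample it has never seen.

```python
# Why signature-based defenses miss zero-day malware: the scanner can only
# flag payloads whose fingerprints are already in its database.
import hashlib

# Signature database: hashes of previously identified malware (invented samples).
KNOWN_SIGNATURES = {
    hashlib.sha256(b"old_known_worm_payload").hexdigest(),
}

def scan(payload: bytes) -> str:
    """Return 'blocked' if the payload matches a known signature."""
    digest = hashlib.sha256(payload).hexdigest()
    return "blocked" if digest in KNOWN_SIGNATURES else "allowed"

print(scan(b"old_known_worm_payload"))  # "blocked": known sample is caught
print(scan(b"novel_zero_day_payload"))  # "allowed": unknown sample slips through
```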


Dillon Beresford was in an incredibly powerful, and potentially lucrative, position. Beresford—or a person who finds him- or herself in a similar position—has four general options when it comes to disclosing a zero-day vulnerability: (1) provide the vulnerability to the vendor (Siemens); (2) release the flaw to the public at large; (3) pass the vulnerability to an intermediary, such as the Department of Homeland Security’s ICS-CERT, which can coordinate the release of the vulnerability with the vendor and the public; or (4) sell the vulnerability to an interested party.39 Each of these options offers a different way of making the vulnerability available to the vendor and the public. Beresford’s decision would therefore determine whether the discovery would be used to improve or to undermine security.


If he disclosed the vulnerability, in some fashion, to the vendor and the public, patches and fixes that address the flaw could be introduced. The possibility of creating a new piece of zero-day malware would quickly evaporate.41 However, if he chose to share his wares exclusively with malicious actors (or to use them for his own malicious ends), a new piece of zero-day malware could become a reality. The stakes were high. Beresford’s decision would make all the difference. Confronted with these options, what should a researcher like Beresford do? And how do different modes of disclosure alternately benefit the researcher, the vendor, and the public?

Limited Disclosure: Contacting Siemens

Beresford could report his findings directly to Siemens while keeping the details of his discovery otherwise secret.42 Under this approach, sometimes referred to as “limited disclosure,” Siemens would be privy to the zero-day flaws, while the larger public, including users of Siemens equipment, would remain in the dark. Approaching Siemens directly would have obvious benefits for both the vendor and the public.43 Siemens could create a new patch or update that fixes the flaws before new malicious exploits can be written. Since the vulnerability is not public knowledge, malicious actors cannot take advantage of it for one very obvious reason: they do not know it exists. Under this scenario, Siemens could release a new patch or update when it is ready and convenient.

Since knowledge of the vulnerability is not widely distributed—as far as Beresford and Siemens are aware, they are the only two parties privy to the flaw—Siemens can create a patch without worrying that the vulnerability is being exploited in the wild.45

Yet there are serious limitations to this approach as well. What would Beresford receive for his efforts? Beresford would likely receive no direct compensation for his work; Siemens, like most vendors, does not purchase vulnerabilities. Beresford might receive some public credit from Siemens once it releases a patch dealing with the vulnerabilities that he discovered, but there is no guarantee.

Revealing the flaws exclusively to Siemens could also place the public at a certain amount of risk. It is impossible to guarantee that knowledge of the vulnerability is actually restricted to Beresford and Siemens. There is no way to know whether or not someone else—a third party—has independently discovered the same vulnerability.46 Someone else may independently find the same vulnerability before Siemens releases a patch.47 As long as the vulnerability exists without an available fix, it is possible that it is being maliciously exploited. No matter how careful Beresford and Siemens are in controlling information about the zero-day vulnerability, someone else may have stumbled upon the same discovery and crafted malware to take advantage of the flaw. Users of the flawed Siemens products are more or less defenseless. If Beresford decides to release the information only to the vendor, end users will remain vulnerable until Siemens decides to release a fix.48 Compounding matters, since Beresford has provided Siemens with the information more or less in secret, there might not be a strong incentive for Siemens to release a timely patch. The ostensible secrecy might, in some cases, allow vendors to ignore vulnerabilities at their discretion.


Beresford might be committed to “doing the right thing.” But what does that mean in this case? Approaching Siemens directly might not offer the best option. Rather than discreetly providing the vulnerability to an audience of one (Siemens), Beresford could release the vulnerability publicly to the widest possible audience.

Full Disclosure: Fame or Infamy?

Beresford could pursue a different path. Rather than releasing the vulnerability directly and exclusively to Siemens, he could release it widely to the public. Through public release, or “full disclosure” as it is known, Siemens, end users, curious observers, and malicious actors would all receive information about the flaw at the same time.50

Full disclosure is a great shortcut to fame.51 Releasing a new zero-day is sure to make a splash. Computer security conferences, such as DEF CON or Black Hat, as well as the Bugtraq mailing list, provide high-profile venues for the release of a new zero-day.52 Beresford would instantly become well known within hacker and computer security circles. He was sitting on something that was definitely interesting. In the wake of Stuxnet, a series of new zero-days impacting Siemens’ PLCs would be sure to turn a lot of heads. With a simple post to a well-read mailing list, or a short presentation at a conference, Beresford could potentially catapult himself into the national conversation about cybersecurity.

Releasing a zero-day publicly—full disclosure—is not, however, without controversy.53 Full disclosure has been debated among security researchers for years.54 Full disclosure might result in some positives. Releasing the vulnerability to the public would push Siemens to quickly address the new flaw: it cannot afford to ignore a publicly released vulnerability.55 If Beresford broadcasts the flaw far and wide, Siemens will have to develop and release a patch as soon as possible. At the same time, public release would allow users of Siemens’ equipment to take proactive steps to protect their systems. Someone like Beresford could further justify full disclosure as a corrective to Siemens’ spotty development practices. As a type of naming and shaming, full disclosure might, in the long run, push vendors to release products that are better designed and more secure.


But full disclosure might also do a significant amount of harm and turn Beresford into a pariah. It could bring him an enormous amount of attention, but ruin his career. Releasing a new vulnerability into the wild without notifying the vendor first is seen by many within computer security circles as an irresponsible and reckless act.57 Siemens would certainly be furious. Making a new vulnerability available to the public allows malicious actors to craft dangerous new exploits before security updates can be created and distributed. Without a head start, Siemens would have to move fast to play catch-up. Professionally, Beresford’s move could backfire. As a researcher at NSS Labs, his career rests, to some degree, on the ability to work in the service of corporate clients. Being branded a “loose cannon” who flaunts his ability to violate the wishes of hardware and software developers might not be a sound career move. Releasing a zero-day publicly might make him a hero in some circles, but to his employer and potential clients it might make him a liability.

Responsible Disclosure: A Middle Path

Beresford could pursue a middle path between the extremes of limited disclosure and full disclosure. Under what has become known as “responsible disclosure,” Beresford could turn over the vulnerability to Siemens in secret, as under limited disclosure, but with an important caveat: after a specified time limit—45 days is common, though there is no universal standard—Beresford would then publicly release the vulnerability.58 Patch or no patch, after the deadline passes, Beresford will publish the vulnerability. A number of third-party intermediaries have been created to facilitate responsible disclosure.59 In particular, Beresford could report his vulnerabilities to ICS-CERT, an intermediary run by the Department of Homeland Security (DHS) devoted to industrial control system security. ICS-CERT accepts vulnerabilities from researchers and provides technical details to the relevant vendor exclusively for 45 days. After 45 days, ICS-CERT makes the vulnerability public.60
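The disclosure clock itself is simple date arithmetic. Below is a toy sketch of the 45-day window just described; the report date and the helper names are hypothetical, and real coordination timelines can be extended by negotiation.

```python
# Toy model of a coordinated ("responsible") disclosure window: the vendor
# gets exclusive notice for a fixed period, after which details go public,
# patch or no patch.
from datetime import date, timedelta

DISCLOSURE_WINDOW = timedelta(days=45)  # the ICS-CERT window cited in the case

def public_release_date(reported: date) -> date:
    """Day the vulnerability details are published."""
    return reported + DISCLOSURE_WINDOW

def is_public(reported: date, today: date) -> bool:
    """Has the vendor's exclusive window closed?"""
    return today >= public_release_date(reported)

reported = date(2011, 5, 1)  # hypothetical report date
print(public_release_date(reported))          # 2011-06-15
print(is_public(reported, date(2011, 6, 1)))  # False: vendor window still open
print(is_public(reported, date(2011, 7, 1)))  # True: details now published
```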


This third path—responsible disclosure—might be attractive to Beresford. It avoids the free-for-all of full disclosure: vendors are given a head start to work on creating a patch before the flaw is public knowledge and open to widespread exploitation.62 At the same time, vendors cannot ignore reported vulnerabilities indefinitely.63 The clock is ticking. After the imposed deadline passes, the information will become public. Further, responsible disclosure ensures that Beresford will receive public credit for his discovery. ICS-CERT, other intermediaries, and most vendors offer public acknowledgment of researchers who submit vulnerabilities through a process of responsible disclosure. Responsible disclosure will help Beresford build a reputation with vendors and potential clients as a reasonable partner.64 These reputational effects are not insignificant. For someone like Beresford, who works for a security company that services corporate clients, the recognition of both his unique skill and his willingness to engage with industry is an asset. Security researchers often sit, at times uneasily, between the world of hackers and the world of corporate security. Responsible disclosure offers a way for researchers like Beresford to straddle these two worlds: it allows them to receive prized recognition of their elite skills within the community of hackers, while signaling to corporate players with lucrative security contracts to fill that they are in fact “responsible” actors.

Commercialization: The Market for Zero-Days

Someone in Dillon Beresford’s position does not necessarily have to give away his discovery. Another possibility hangs in the air: he could sell the vulnerabilities. Why give away your discovery when you can sell it? Zero-days are valuable: recent reports in the popular press have indicated that particularly prized zero-days can fetch six figures.65 Not bad for a few months of work. Siemens is an unlikely buyer; as a matter of corporate policy, most vendors refuse to buy vulnerabilities.


Others, however, might be more than happy to do business with Beresford. There is a growing market for zero-days. Security companies, governments, and criminals are all active in the market for vulnerabilities. Security companies, such as the Zero Day Initiative (ZDI) and VeriSign, buy zero-days from researchers and use the acquired flaws as part of customized security packages that they sell to clients.67 By purchasing a new zero-day, a company like ZDI can promise its clients protection from flaws that competing security companies cannot recognize. For these companies, buying a zero-day differentiates their services from those of other security vendors. For an operator using Siemens PLCs, a security firm that can boast exclusive access to cutting-edge, critical security flaws might be someone with whom you would like to partner. Some companies, like ZDI, eventually turn over the vulnerability to the affected vendor (after providing their clients with exclusive protection for a few months), while others keep the purchased information in-house indefinitely, leaving those that do not purchase their services exposed.

The real money, however, is not in defense but in offense. Others buy zero-days for active exploitation—for offensive operations. Charlie Miller, a well-known hacker and former employee at the National Security Agency, recently remarked that the “only people paying are on offense.”68 As Stuxnet illustrated, nation-states use zero-days to create new and undetectable cyberweapons. Unsurprisingly, governments and defense contractors—including major players such as General Dynamics, Raytheon, and Northrop Grumman—are considered to be the largest purchasers of zero-days.69 The Siemens vulnerabilities might be particularly attractive to a nation-state looking to expand its portfolio of offensive cyberweapons. The vulnerabilities Beresford found impact systems that control physical processes across the globe. Attacking these systems might have significant strategic value for a military.

Beresford could also turn to the criminal underground. Organized crime is, of course, a significant player in cybercrime. The Siemens vulnerabilities might not have the same appeal as a vulnerability that impacts widely used general software, like Microsoft Internet Explorer or Java, and can be used to gain access to coveted consumer data.70 But criminals might find the possibility of threatening utilities—demanding payment in exchange for leaving their control systems alone—enticing.


For Beresford, selling his wares might be more difficult than it appears. The market for zero-days is not transparent: much of the buying and selling happens in informal settings and underground. Connecting with a buyer and, more importantly, determining what constitutes a “fair” price for the vulnerabilities are challenges. Generally, the value of a vulnerability is shaped by a number of factors, such as how widely used the software is and the reliability of exploits taking advantage of the vulnerability.72 There is pressure to sell fast. Zero-days have a shelf life: once the vulnerability is public, its value shrinks to zero.73 If someone else discovers and publicly releases the vulnerability before Beresford can close the sale, he would be left with a worthless product. Additionally, as new versions of software are released, there is a risk that the uncovered flaw will disappear. What if Siemens releases an upgrade for its PLCs and the vulnerability disappears while Beresford is negotiating a sale? His once valuable discovery would curdle. If he is going to sell, he ought to do it quickly.
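The pricing dynamics sketched above (value driven by install base and exploit reliability, collapsing to nothing at disclosure) can be captured in a toy model. The function, weights, and dollar figures are invented for illustration; the real market, as the case notes, is opaque.

```python
# Toy valuation of a zero-day. Multiplicative factors are a simplifying
# assumption, not an empirical pricing formula.

def zero_day_value(base_price: float,
                   install_base: float,   # 0..1: share of targets running the software
                   reliability: float,    # 0..1: how dependably the exploit works
                   is_public: bool) -> float:
    """Estimated sale value of a zero-day under this toy model."""
    if is_public:
        return 0.0  # once disclosed, the exclusive value evaporates
    return base_price * install_base * reliability

# Hypothetical numbers: a six-figure base price, discounted for reach and reliability.
print(zero_day_value(200_000, 0.5, 0.5, is_public=False))  # 50000.0
print(zero_day_value(200_000, 0.5, 0.5, is_public=True))   # 0.0
```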


Notes

* The author wishes to thank Professor Venky Narayanamurti, Professor Greg Morrisett, Professor Jim Waldo, and Katie Moussouris for their comments and suggestions. This work is funded, in part, by the Office of Naval Research under award number N00014-09-1-0597. Any opinions, findings and conclusions or recommendations expressed in this publication are those of the author and do not necessarily reflect the views of the Office of Naval Research. 1 Robert O’Harrow Jr., “Everyday Machines Vulnerable to Hacking,” Washington Post, June 4, 2012. 2 Brian Krebs, “Experts Warn of New Windows Shortcut Flaw,” Krebs on Security: In-Depth Security News and Investigation, July 10, 2010. For an overview of the early reporting of Stuxnet, see Michael J. Gross, “A Declaration of Cyber-War,” Vanity Fair, April 2011. For comprehensive accounts of the creation, deployment, and operation of Stuxnet, see David E. Sanger, Confront and Conceal: Obama’s Secret Wars and Surprising Use of American Power, New York: Crown, 2012, and Symantec, W32.Stuxnet Dossier, Version 1.4, February 2011. 3 Gross, “A Declaration of Cyber-War.”; Mark Clayton, “Stuxnet Spyware Targets Industrial Facilities, via USB Memory Stick,” The Christian Science Monitor, July 23, 2010. 4 For an overview, see Symantec, Internet Security Threat Report: 2012, Vol. 18. 2013. 5 Sanger, “Iran Fights Malware Attacking Computers.”; Mark Clayton, “Stuxnet Malware is ‘Weapon’ Out to Destroy…Iran’s Bushehr Nuclear Plant?” The Christian Science Monitor, September 21, 2010; Jonathan Fildes, “Stuxnet Worm ‘Targeted High-Value Iranian Assets,’” BBC News, September 23, 2010; Josh Halliday, “Stuxnet Worm is the ‘Work of a National Government Agency,’” The Guardian, September 24, 2010.
6 Clayton, “Stuxnet Malware is ‘Weapon’ Out to Destroy…Iran’s Bushehr Nuclear Plant?”; Reuters, “UPDATE 2: Cyber Attack Appears to Target Iran-Tech Firms,” Reuters, September 24, 2010; John Markoff, “A Silent Attack, but Not a Subtle One,” The New York Times, September 26, 2010; Richard A. Falkenrath, “From Bullets to Megabytes,” The New York Times, January 27, 2011; Holger Stark, “Stuxnet Virus Opens New Era of Cyber War,” Spiegel Online, August 8, 2011. 7 See, Sanger, Confront and Conceal. 8 Symantec, W32.Stuxnet Dossier; Carey Nachenberg, “A Forensic Dissection of Stuxnet,” Center for International Security and Cooperation, Stanford University, April 23, 2012; Riva Richmond, “Malicious Software Program Attacks Industry,” The New York Times, September 25, 2010. 9 National Institute of Standards and Technology, Guide to Industrial Control Systems (ICS) Security, Special Publication #800-82, June 2011. 10 Stuxnet targeted linked elements of the industrial control system. Specifically, it targeted Siemens’ SIMATIC WinCC and Step 7 PLCs. Tofino Security, “How Stuxnet Spreads,” Version 1, February 22, 2011. 11 Sanger, Confront and Conceal; Symantec, W32.Stuxnet Dossier, Nachenberg, “A Forensic Dissection of Stuxnet.” 12 David E. Sanger, “Obama Order Sped Up Wave of Cyberattacks Against Iran,” The New York Times, June 1, 2012. 
13 Sanger, “Obama Order Sped Up Wave of Cyberattacks Against Iran.” 14 See, Kaspersky Lab, “Kaspersky Lab Provides Insights on Stuxnet Worm,” Virus News, Kaspersky Lab, September 24, 2010; Fildes, “Stuxnet Worm ‘Targeted High-Value Iranian Assets.’”; Halliday, “Stuxnet Worm is the ‘Work of a National Government Agency.’”; Sanger, “Iran Fights Malware Attacking Computers.”; Richmond, “Malicious Software Program Attacks Industry.” 15 Gross, “A Declaration of Cyber-War.” See also, Andy Greenberg, “Shopping for Zero-Days: A Price List for Hackers’ Secret Software Exploits,” Forbes, March 23, 2012; Tom Simonite, “Welcome to the Malware-Industrial Complex,” MIT Technology Review, February 13, 2013. 16 Nachenberg, “A Forensic Dissection of Stuxnet.”; “Stuxnet Worm ‘Targeted High-Value Iranian Assets.’” 17 Symantec, W32.Stuxnet Dossier; Kaspersky Lab, “Kaspersky Lab Provides Insights on Stuxnet Worm.”; “Stuxnet Worm ‘Targeted High-Value Iranian Assets.’”; Nachenberg, “A Forensic Dissection of Stuxnet.” 18 Kaspersky Lab, “Kaspersky Lab Provides Insights on Stuxnet Worm.”; Sanger, Confront and Conceal. 19 O’Harrow Jr., “Everyday Machines Vulnerable to Hacking.” 20 Ibid. 21 Many of the details of Beresford’s work are drawn from the detailed account he provided in: Digital Bond, “Dillon Beresford on Siemens Vulns and Rick Kaun on CIP,” Interview with Dillon Beresford, This Month In Control System Security Podcast, June, 2011; Dillon Beresford, “Siemens,” SCADASEC [Online Mailing List], May 23, 2011. 22 O’Harrow Jr., “Everyday Machines Vulnerable to Hacking.” 23 Quoted in Ellen Messmer, “Siemens’ ‘Damage Control’ Response to SCADA Bug Frustrates Researcher,” Network World, May 23, 2011; Beresford, “Siemens,” May 23, 2011. 24 O’Harrow Jr., “Everyday Machines Vulnerable to Hacking.” 25 The New Zealand Herald, “Science Fiction-Style Sabotage a Fear in New Hacks,” The New Zealand Herald, October 25, 2011.
26 Beresford initially focused on S7-1200s, but the vulnerabilities also impacted S7-300s and S7-400s. Beresford, “Exploiting Siemens Simatic S7 PLCs.”; Digital Bond, “Dillon Beresford on Siemens Vulns and Rick Kaun on CIP.” 27 Siemens, The Company-Siemens 2013, 2013, p. 12. 28 Siemens, “S7-1200 Product Information.” 29

30 Robert McMillan, “After Delay, Hacker to Show Flaws in Siemens Industrial Gear,” IDG News Service, IT World, June 6, 2011; Dillon Beresford, “Exploiting Siemens Simatic S7 PLCs,” Black Hat USA, July 8, 2011.
31 Robert McMillan, “A Power Plant Hack That Anybody Could Use,” IDG News Service, PC World, August 5, 2011.
32 The discussion that follows is informed by the significant and growing literature on vulnerability disclosure. See Adam Hahn and Manimaran Govindarasu, “Cyber Vulnerability Disclosure Policies for the Smart Grid,” IEEE: Power and Energy Society General Meeting, 2012; Andrea Matwyshyn, Ang Cui, Angelos D. Keromytis, and Salvatore J. Stolfo, “Ethics in Security Vulnerability Research,” IEEE: Security & Privacy, 8.2 (2010): 67-72; Andy Ozment, “The Likelihood of Vulnerability Rediscovery and the Social Utility of Vulnerability Hunting,” Working Paper, Computer Laboratory, University of Cambridge, Version 1.1; Ashish Arora and Rahul Telang, “Economics of Software Vulnerability Disclosure,” IEEE: Security & Privacy, 3.1 (2005): 20-25; Bruce Schneier, “Full Disclosure,” Crypto-Gram Newsletter, November 15, 2001; Bruce Schneier, “Full Disclosure and the Window of Exposure,” Crypto-Gram Newsletter, September 15, 2000; Bruce Schneier, “Recent Developments in Full Disclosure,” Schneier on Security, December 6, 2011; Dmitri Nizovtsev and Marie Thursby, “Economic Analysis of Incentives to Disclose Software Vulnerabilities,” Washburn University School of Business Working Paper Series, #41, April 2005; Ethan Preston and John Lofton, “Computer Security Publications: Information Economics, Shifting Liability and the First Amendment,” Whittier Law Review (2002-2003): 71-142; Eric Rescorla, “Is Finding Security Holes a Good Idea?” available online at http://www.dtc.umn.edu/weis2004/rescorla.pdf; Hasan Cavusoglu, Huseyin Cavusoglu, and Srinivasan Raghunathan, “Efficiency of Vulnerability Disclosure Mechanisms to Disseminate Vulnerability Knowledge,” IEEE Transactions on Software Engineering, 33.3 (2007): 171-185; Marcus J. Ranum, “The Vulnerability Disclosure Game: Are We More Secure?” CSO: Security and Risk, March 1, 2008; Miles McQueen, Jason L. Wright, and Lawrence Wellman, “Are Vulnerability Disclosure Deadlines Justified?” Third International Workshop on Security Measurements and Metrics, 2011; Scott Berinato, “Software Vulnerability Disclosure: The Chilling Effect,” CSO: Security and Risk, January 1, 2007.
33 Please note: the following discussion is a stylized examination of what someone in Beresford’s position might consider. It does not reflect, and is not intended to represent, the deliberative process that Beresford actually went through.
34 See Schneier, “Full Disclosure and the Window of Exposure.”
35 See Cavusoglu, Cavusoglu, and Raghunathan, “Efficiency of Vulnerability Disclosure Mechanisms to Disseminate Vulnerability Knowledge”; Daniel J. Ryan, “Two Views on Security and Software Liability: Let the Legal System Decide,” IEEE: Security & Privacy, 1.1 (2003): 70-72; Ranum, “The Vulnerability Disclosure Game: Are We More Secure?”; Microsoft, Trustworthy Computing, Software Vulnerability Management at Microsoft, July 2010; Schneier, “Full Disclosure”; Schneier, “Full Disclosure and the Window of Exposure.”
36 See Symantec, Internet Security Threat Report: 2013, April 2013.
37 Ibid.
38 Ibid.
39 These options follow the three basic models of vulnerability disclosure most often discussed in the literature: (1) limited disclosure; (2) full disclosure; (3) responsible or coordinated disclosure. These models are occasionally discussed under slightly different headings. Commercialization offers a fourth option. For an overview, see supra note 32.
40 The moment of disclosure is a key “control point” in the life-cycle of zero-day malware. On the notion of control points, see David Clark, “Control Point Analysis,” TPRC, September 10, 2012.
41 See Arora and Telang, “Economics of Software Vulnerability Disclosure”; Hahn and Govindarasu, “Cyber Vulnerability Disclosure Policies for the Smart Grid”; Schneier, “Full Disclosure and the Window of Exposure.”
42 See supra note 32.
43 See supra note 32.
44 In many cases, creating and deploying a new patch is a significant task. Patches must be tested to ensure that they do not inadvertently create new compatibility issues or introduce new vulnerabilities. Additionally, releasing patches on a predictable schedule, as part of regularly occurring updates, makes it more likely that end users will actually install important new updates. See Cavusoglu, Cavusoglu, and Raghunathan, “Efficiency of Vulnerability Disclosure Mechanisms to Disseminate Vulnerability Knowledge”; Hilary K. Browne, William A. Arbaugh, John McHugh, and William L. Fithen, “A Trend Analysis of Exploitations,” November 9, 2000; Microsoft, Trustworthy Computing, Software Vulnerability Management at Microsoft; Rescorla, “Is Finding Security Holes a Good Idea?”; William A. Arbaugh, William L. Fithen, and John McHugh, “Windows of Vulnerability: A Case Study Approach,” Computer, 33.12 (2000): 52-59.
45 Cavusoglu, Cavusoglu, and Raghunathan, “Efficiency of Vulnerability Disclosure Mechanisms to Disseminate Vulnerability Knowledge.”
46 Ozment, “The Likelihood of Vulnerability Rediscovery and the Social Utility of Vulnerability Hunting”; Schneier, “Full Disclosure and the Window of Exposure.”


47 See McQueen, Wright, and Wellman, “Are Vulnerability Disclosure Deadlines Justified?”; Schneier, “Full Disclosure and the Window of Exposure”; Bruce Schneier, “The Vulnerabilities Market and the Future of Security,” Crypto-Gram Newsletter, June 15, 2012.
48 McQueen, Wright, and Wellman, “Are Vulnerability Disclosure Deadlines Justified?”; Nizovtsev and Thursby, “Economic Analysis of Incentives to Disclose Software Vulnerabilities.”
49 During the 1990s, security researchers became increasingly frustrated by what they saw as the indifference of vendors to reported vulnerabilities. Software vendors, it was charged, failed to address vulnerabilities reported by independent researchers, or at least failed to address them in a timely manner. Since vendors are generally shielded from liability for vulnerabilities by end user license agreements, some of the incentives to fix flawed products that are traditionally found in other commercial markets are absent. See Matwyshyn et al., “Ethics in Security Vulnerability Research”; Preston and Lofton, “Computer Security Publications: Information Economics, Shifting Liability and the First Amendment”; Rahul Telang and Sunil Wattal, “An Empirical Analysis of the Impact of Software Vulnerability Announcements on Firm Stock Price,” IEEE Transactions on Software Engineering, 22.8 (2007): 544-557; Ryan, “Two Views on Security and Software Liability: Let the Legal System Decide”; Schneier, “Full Disclosure and the Window of Exposure”; Schneier, “Full Disclosure.”
50 Full disclosure is simple: the researcher releases the new zero-day publicly without notifying the vendor (or anyone else) beforehand. The vulnerability becomes available to everyone, the public and the vendor alike, simultaneously. See supra note 32.
51 Schneier, “Full Disclosure.”
52 McKinney, “Vulnerability Bazaar.”
53 Preston and Lofton, “Computer Security Publications: Information Economics, Shifting Liability and the First Amendment”; Schneier, “Full Disclosure and the Window of Exposure”; Schneier, “Full Disclosure.”
54 Members of the computer security community began advocating for full disclosure in response to the perceived failings of limited disclosure. Preston and Lofton, “Computer Security Publications: Information Economics, Shifting Liability and the First Amendment”; Schneier, “Full Disclosure and the Window of Exposure”; Schneier, “Full Disclosure.”
55 McQueen, Wright, and Wellman, “Are Vulnerability Disclosure Deadlines Justified?”; Schneier, “Full Disclosure.”
56 See Matwyshyn et al., “Ethics in Security Vulnerability Research”; Preston and Lofton, “Computer Security Publications: Information Economics, Shifting Liability and the First Amendment”; Ryan, “Two Views on Security and Software Liability: Let the Legal System Decide”; Schneier, “Full Disclosure”; Telang and Wattal, “An Empirical Analysis of the Impact of Software Vulnerability Announcements on Firm Stock Price.”
57 See Schneier, “Full Disclosure”; Ranum, “The Vulnerability Disclosure Game: Are We More Secure?”
58 See supra note 32.
59 Hahn and Govindarasu, “Cyber Vulnerability Disclosure Policies for the Smart Grid”; Kannan and Telang, “An Economic Analysis of Market for Software Vulnerabilities”; McQueen, Wright, and Wellman, “Are Vulnerability Disclosure Deadlines Justified?”
60 ICS-CERT, “ICS-CERT Vulnerability Disclosure Policy,” available online at <https://ics-cert.us-cert.gov/ICS-CERT-VulnerabilityDisclosure-Policy>.
61 ICS-CERT is not unique. Many other intermediaries have arisen to facilitate responsible disclosure. See Hahn and Govindarasu, “Cyber Vulnerability Disclosure Policies for the Smart Grid”; Kannan and Telang, “An Economic Analysis of Market for Software Vulnerabilities”; McQueen, Wright, and Wellman, “Are Vulnerability Disclosure Deadlines Justified?”
62 Unsurprisingly, vendors actively embrace and promote responsible disclosure. For example, see Microsoft, Trustworthy Computing, Software Vulnerability Management at Microsoft.
63 McQueen, Wright, and Wellman, “Are Vulnerability Disclosure Deadlines Justified?”; Schneier, “Full Disclosure.”
64 Intermediaries, such as ICS-CERT, and vendors publicly acknowledge the contributions of researchers who practice responsible disclosure.
65 See Andy Greenberg, “Meet the Hackers Who Sell Spies the Tools to Crack Your PC (And Get Paid Six-Figure Fees),” Forbes, March 21, 2012; Charlie Miller, “The Legitimate Vulnerability Market: Inside the Secretive World of 0-day Exploit Sales,” Workshop on the Economics of Information Security, 2007; David McKinney, “Vulnerability Bazaar,” IEEE: Security and Privacy, 5.6 (2007): 69-73; Joseph Menn, “U.S. Cyberwar Strategy Stokes Fear of Blowback,” Reuters, May 10, 2013; Karthik Kannan and Rahul Telang, “An Economic Analysis of Market for Software Vulnerabilities,” May 3, 2004; Ryan Gallagher, “The Secretive Hacker Market for Software Flaws,” Slate, January 16, 2013; Tom Simonite, “Welcome to the Malware-Industrial Complex,” MIT Technology Review, February 13, 2013; Schneier, “The Vulnerabilities Market and the Future of Security.”
66 Some vendors offer bug bounty programs, which provide a nominal fee (a few thousand dollars in most cases) and public recognition to researchers who uncover and submit new zero-days. Siemens is considering introducing a bug bounty program but currently does not offer any compensation in exchange for a vulnerability. Kelly Higgins, “SCADA Security 2.0,” Dark Reading, January 24, 2013.


67 McKinney, “Vulnerability Bazaar.”
68 Qtd. in Menn, “U.S. Cyberwar Strategy Stokes Fear of Blowback.”
69 Greenberg, “Meet the Hackers Who Sell Spies the Tools to Crack Your PC (And Get Paid Six-Figure Fees)”; Schneier, “The Vulnerabilities Market and the Future of Security.”
70 Greenberg, “Shopping for Zero-Days: A Price List for Hackers’ Secret Software Exploits”; Miller, “The Legitimate Vulnerability Market: Inside the Secretive World of 0-day Exploit Sales.”
71 See Joel Brenner, America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime, and Warfare, New York: Penguin, 2011.
72 See Miller, “The Legitimate Vulnerability Market: Inside the Secretive World of 0-day Exploit Sales.”
73 McKinney, “Vulnerability Bazaar”; Miller, “The Legitimate Vulnerability Market: Inside the Secretive World of 0-day Exploit Sales”; Schneier, “The Vulnerabilities Market and the Future of Security.”

