Government Hacking Makes Everyone Less Safe

Danny Yadron,
Legal Intern,
ACLU
Jennifer Stisa Granick,
Surveillance and Cybersecurity Counsel, ACLU Speech, Privacy, and Technology Project
September 13, 2018

Last week, the Justice Department announced criminal charges against a North Korean operative for a malware attack that endangered hospital systems and crippled the computers of businesses, governments, and individuals around the world. Americans might be surprised to learn that the software used for this 2017 attack — known as “WannaCry” — was based on a hacking tool created by the U.S. government itself.

The NSA developed the tool for its own hacking operations and, inevitably, it leaked out. This incident raises questions about the wisdom of allowing the U.S. government — and law enforcement agencies in particular — to deploy hacking as a tool of surveillance.

Government hacking proposals have evolved in the context of the FBI’s “Going Dark” public relations campaign, which claims that the growing use of encryption will eviscerate the FBI’s ability to eavesdrop on criminals. To guard against this, the government says it needs tech companies to compromise customer security by providing “backdoor” access to law enforcement, giving it broad access to private communications and other revealing personal data.

But security experts almost uniformly agree that it is dangerous to design encryption to ensure investigators can have access to everything. Giving the government this power would render encryption software less secure since it would necessarily have a built-in weakness.

As the government vigorously pursues its campaign to force back doors into communications systems and devices, some security experts have proposed an odd compromise: instead of giving the government more expansive backdoor privileges, it should be allowed to deploy hacker tricks, arguably compromising fewer people’s data in the process.

The thinking goes like this: Because the government would not be allowed to force companies to build insecurities into all modern communications systems, most consumers could maintain their digital privacy. Regulations, moreover, could ensure that the government only hacks people in limited investigations and with probable cause to believe criminal activity is underway.

In a recent report, Riana Pfefferkorn of Stanford Law School’s Center for Internet and Society (CIS) analyzes the cybersecurity risks of this practice for all internet users — not just law enforcement’s few targeted suspects. (The ACLU’s Jennifer Granick, formerly with CIS, contributed to the report.)

Pfefferkorn argues that government hacking creates an incentive to hoard — rather than disclose and patch — vulnerabilities that criminal hackers could steal or independently discover. She also points out that government hacking cultivates a market for surveillance tools and creates an incentive for the government to push for less secure software and standards.

These concerns are far from theoretical, as multiple government hacking operations have jeopardized the digital security of innocent people. In the case of the WannaCry attacks, in April 2017, a group of hackers released a cache of NSA hacking tools, which included details of previously undisclosed flaws in popular Microsoft software. Microsoft had issued a patch a month earlier — after the NSA noticed the tools were stolen but before the hackers released them to the public. Nevertheless, too many users — as is often the case — did not or could not quickly install it.

The following month, a team allegedly working for the North Korean government used the software flaw to launch a global ransomware attack that, as Pfefferkorn writes, “infected such crucial systems as hospitals, power companies, shipping, and banking, endangering human life as well as economic activity.” Microsoft, rightfully, was critical of the government: the NSA had kept the vulnerability secret rather than giving the company and its customers more time to update the software.

While targeted government hacking might initially affect fewer people compared to back doors, as the paper concludes, “when the government cannot maintain control over its exploits, hacking looks less like a targeted sniper’s bullet and more like a poorly-aimed bomb, with a broad and indiscriminate blast radius.” Even regulated government hacking poses a security danger to the public.
