
‘Ransomware attack is why we can’t have security backdoors’

By USA Today (TNS) - May 25, 2017 - Last updated at May 25, 2017


SAN FRANCISCO — Privacy experts are calling the global ransomware attack that hit 150 countries a prime example of why requiring tech companies to create backdoors into computer programmes is a bad idea, because of the danger those digital keys might be stolen.

“This is a fine example of the difficulty of keeping secrets,” said Cooper Quintin, a staff technologist with the Electronic Frontier Foundation, a digital liberties non-profit based in San Francisco.

The WannaCry ransomware attack hit on Friday and was contained relatively quickly, but not before it infected at least 200,000 computers. The software exploited a flaw in the Windows operating system using code that Microsoft and others said was stolen from the National Security Agency, or from a group believed to be affiliated with it, where it is thought to have formed part of a US cyber-attack arsenal.

The NSA has said it did not create ransomware tools, but has not addressed the issue of whether the original exploitable flaw the ransomware was based on came from stolen NSA cyber tools.

The fact that these tools appear to have been stolen from a US government-linked group and are now in the public domain has bolstered tech companies’ contention that security backdoors would do more harm than good, simply because such work-arounds risk ending up in criminal hands.

“This attack provides yet another example of why the stockpiling of vulnerabilities by governments is such a problem,” Brad Smith, Microsoft’s chief legal counsel, said in a blog post.

“We have seen vulnerabilities stored by the CIA show up on WikiLeaks, and now this vulnerability stolen from the NSA has affected customers around the world,” he wrote. “Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage.”

Government officials and law enforcement have pressed tech companies to write security keys into computer programmes and operating systems that would aid them in gaining access to the e-mail, networks or smartphones of suspected criminals.

This was the heart of the legal battle waged between Apple and the FBI for 43 days last year as the agency sought Apple’s help to write software that would aid it in breaking into an iPhone used by San Bernardino gunman Syed Rizwan Farook.

A bill proposed last August by Senators Richard Burr and Dianne Feinstein, of the Senate Intelligence Committee, would have required companies to provide technical support to get to encrypted data, but did not specify how that would have to be done.

During its legal battle, Apple argued that it should not be required to write code to allow the FBI to try to get into the iPhone because it was simply too dangerous to do so — once written, it could too easily get hacked, leaked and misused.

In an Op-Ed in the Washington Post at the time, Apple Senior Vice President of Software Engineering Craig Federighi sounded eerily prescient about the damage such stolen tools could wreak.

“Great software has seemingly limitless potential to solve human problems — and it can spread around the world in the blink of an eye. Malicious code moves just as quickly, and when software is created for the wrong reason, it has a huge and growing capacity to harm millions of people,” he wrote.

The ransomware attack is linked to code that started off with a US government group, but ended up in criminal hands. A group called the Shadow Brokers said it stole the Windows exploits, the code used to take advantage of the underlying flaws, and posted them online in mid-April; Microsoft issued a patch for those flaws.

That was not enough: Because the vulnerability was in older Windows operating systems, one of which Microsoft had stopped supporting, users around the world who had not applied the patch were left vulnerable when a hacker organisation — now thought to be the same one behind the Sony Pictures Entertainment hack — used the flaw to create malware that paralysed computers.

The government’s argument, most often articulated by former FBI director James Comey, was that law enforcement needed to be able to overcome encryption and gain access to computer systems in order to fight law breakers and terrorists who have access to increasingly powerful digital tools to hide their activities.

In March, Comey suggested that the United States and other countries could create a system to allow legal access to tech devices, “a framework, for when government access is appropriate”.

The FBI declined to comment.

It is reasonable to argue that mandating some kind of lawful access mechanism could add an “unquantifiable additional degree of insecurity” to software and electronic devices, said Adam Klein, a senior fellow at the Centre for New American Security in Washington DC who studies national security and surveillance.

“The real question is whether the social value of solving some additional set of crimes would be less than or greater than the social cost that the risk of key theft would create. I don’t know what the answer to that is, but it’s not a frivolous question.”

Tech companies and privacy advocates fear that there is simply no way for digital keys to any system to be 100 per cent protected.

“Even if you design backdoors with the goal of only allowing access by law enforcement, as a practical matter there’s no way to ensure that the bad actors don’t gain access,” said Neema Singh Guliani, legislative counsel with the American Civil Liberties Union.

What happened this week will not be lost on judges in the future should the government again try to get tech companies to build backdoor access into programmes, said Kristen Eichensehr, a law professor at the University of California at Los Angeles with expertise in national security law and cybersecurity.

“What we’ve seen happen with WannaCry lends credence to that, and certainly any court is going to take it into account. The government has shown that it itself is persistently incapable of keeping its tools secure,” she said.
