Last Week in Tech Law & Policy, Vol. 31: Sponsored and Compelled Hacking, Government Edition

(by Colter Donahue, Colorado Law 3L)

Should government agencies possess, compel, or sponsor hacking tools and backdoors? A backdoor is a method of bypassing the normal authentication system of a website, messaging service, or other means of electronic communication.
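To make the idea concrete, here is a minimal, purely hypothetical Python sketch of a login routine with a hard-coded backdoor. Nothing here reflects any real product; the user table, salt, and override token are invented for illustration.

```python
import hashlib
import hmac

# Hypothetical password store and hard-coded bypass token (invented values).
_SALT = b"example-salt"
USERS = {"alice": hashlib.sha256(_SALT + b"correct horse battery").hexdigest()}
BACKDOOR_TOKEN = "debug-override-1234"


def authenticate(username: str, secret: str) -> bool:
    # Normal path: compare the supplied secret against the stored hash.
    stored = USERS.get(username)
    supplied = hashlib.sha256(_SALT + secret.encode()).hexdigest()
    if stored is not None and hmac.compare_digest(supplied, stored):
        return True
    # Backdoor path: a fixed token skips the check entirely. Anyone who
    # learns the token -- an agency, a criminal, a foreign state -- gets in.
    return secret == BACKDOOR_TOKEN
```

The normal path checks the user's secret against a stored hash; the backdoor path accepts a fixed token that skips that check, and anyone who learns the token gets the same access.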

Privacy and encryption advocates point out that the tools created or the vulnerabilities exploited by backdoors pose a privacy risk. Those vulnerabilities are not limited to exploitation by U.S. agencies like the FBI and NSA; bad actors and other nations can use them too. Hacking tools don’t always stay secret, and once exposed, the potential damage may be measured on a global scale. But what happens when law enforcement needs access for investigatory purposes? This post looks at a recent example and the balance of competing interests.

In the weeks and months following the San Bernardino attack at the Inland Regional Center, the FBI engaged in a court battle with Apple, Inc.—the maker of an iPhone 5C involved in the FBI investigation. On February 16, 2016, Magistrate Judge Sheri Pym (C.D. Cal.) ordered Apple to create and load a new operating system that would disable the iPhone’s data self-destruct and escalating-delay features and allow electronic passcode entry.
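To see why those features mattered, here is a rough, hypothetical Python simulation of the two protections the order asked Apple to disable: escalating delays between failed attempts and an auto-erase after too many failures. The delay values and wipe threshold are illustrative, not Apple’s actual parameters.

```python
import time

# Illustrative numbers only: seconds of delay after the Nth failed attempt,
# and a data wipe once the failure limit is reached.
DELAYS = [0, 0, 0, 0, 60, 60, 300, 900, 3600, 3600]
WIPE_AFTER = 10


def try_passcode(guess: str, real: str, failures: int) -> tuple[bool, int]:
    """Return (unlocked, updated failure count); 'wipe' the data at the limit."""
    if guess == real:
        return True, 0
    failures += 1
    if failures >= WIPE_AFTER:
        raise RuntimeError("device wiped: encryption key destroyed")
    time.sleep(DELAYS[failures])  # escalating delay slows brute force to a crawl
    return False, failures
```

With these protections removed and electronic passcode entry allowed, all 10,000 four-digit passcodes could be tried in a matter of minutes, which is why critics read the order as compelling Apple to weaken its own security.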

The same day, Apple CEO Tim Cook released a message to Apple customers in opposition to the court order:

Building a version that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The government relied on the Communications Assistance for Law Enforcement Act (CALEA), which requires built-in surveillance capabilities, and on the All Writs Act (AWA), which allows courts to issue orders “necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” By the end of March, in the midst of the court battle, the FBI obtained a hack to access the phone and requested that the order be vacated.

We have since learned that, although the FBI can use the iPhone hack, it doesn’t know how the hack actually works. The tool, purchased for approximately $1 million, works only on iPhone 5C devices running iOS 9.

In light of the wide-ranging NSA spying operations exposed by Edward Snowden, should government agencies have access to such a tool? What happens if it gets out? Just last month, similar top-secret NSA exploits were leaked online.

There is undoubtedly a privacy risk in creating or using vulnerabilities in technology. However, situations may exist where access is beneficial to society.

While many backed Apple CEO Tim Cook’s position, prominent figures and politicians took another stance. For example, Republican Presidential nominee Donald Trump took to Twitter:

If Apple doesn’t give info to authorities on the terrorists I’ll only be using Samsung until they [Apple] give info.

Setting aside that Samsung does not make the operating system used on its phones, the number of likes indicates Mr. Trump is not alone in taking this position. Democratic nominee Hillary Clinton, responding to a question about the encryption used in the terrorist attacks in France, took a more neutral stance:

[I]t doesn’t do anybody any good if terrorists can move toward encrypted communication that no law enforcement agency can break into before or after.

(To watch this portion of the December Democratic debate aired on ABC, click here.)

Senator Richard Burr (R-NC) also voiced concerns:

While the national security implications of this situation are significant, the outcome of this dispute will also have a drastic effect on criminal cases . . . Murderers, pedophiles, drug dealers and the others are already using this technology to cover their tracks.

There exists a societal interest in finding, investigating, and prosecuting individuals who commit crimes. Technology, however, is changing the nature of criminal investigations. Criminals no longer keep evidence in plain sight and unprotected, and gone are the days of physical ledgers, notebooks, and letters.

Relevant evidence may not even be stored within a court’s jurisdiction. A proposed change to Rule 41 of the Federal Rules of Criminal Procedure would allow courts to “authorize warrants to remotely access, search, seize, or copy data on computers, wherever in the world they are located,” a move criticized by public interest organizations. If implemented, the new rule would expand the government’s authority to hack remote computers.

Another proposal went even further, pushing into the domain of the Apple v. FBI court battle described earlier in this post. Senators Feinstein (D-CA) and Burr released a discussion draft of the Senate Intelligence Committee’s bill later titled the “Compliance with Court Orders Act of 2016.” The bill’s stated purpose is “[t]o require the provision of data in an intelligible format to a government pursuant to a court order.” Andy Greenberg, a senior writer for WIRED, stated that the legislation would make “user-controlled encryption” illegal, effectively allowing the government to bypass authentication by the owner of the data. Had the bill passed, Apple could no longer offer a system in which device data is encrypted and the owner is the only party who holds the key, as in the rough sketch below.
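As an illustration only (not Apple’s actual design), the following Python sketch uses the third-party cryptography library to derive an encryption key from a passcode that only the device owner knows. The salt handling, iteration count, and passcode are invented for the example.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def key_from_passcode(passcode: str, salt: bytes) -> bytes:
    """Derive a symmetric key from the owner's passcode (illustrative parameters)."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode.encode()))


salt = os.urandom(16)                        # stored on the device; not secret
key = key_from_passcode("owner-passcode", salt)
ciphertext = Fernet(key).encrypt(b"messages, photos, location history")

# Only someone who knows the passcode can re-derive the key and decrypt.
# The manufacturer never sees the passcode, so it cannot produce the
# plaintext without weakening this design.
plaintext = Fernet(key_from_passcode("owner-passcode", salt)).decrypt(ciphertext)
```

Under a mandate like the draft bill, a provider would have to keep a second copy of the key or build a way around this derivation, which is precisely the kind of backdoor the earlier sketch illustrates.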

While the push for legislation faded, questions remain. How do we weigh the competing interests in privacy, data security, and law enforcement? If we create vulnerabilities and backdoors, law enforcement may have access when appropriate, but methods and vulnerabilities, once known, may become available to everyone. Encryption and privacy advocates may have escaped the Compliance with Court Orders Act for the time being, but the change to Rule 41 is pending Congressional review. The FBI may continue to pursue court orders based on CALEA and the AWA. Companies like Apple will continue to encrypt devices and data. Which side is right? Is there a balance to be struck? How do we decide which interests are most important? Should the government possess such tools or compel others to create backdoors? Is the public discourse really as polarized as it seems?
