Last week’s blog post concerned the ramifications of government-sponsored and government-compelled hacking through the use of encryption backdoors. This week’s post examines how the government hacks computers using the Tor browser, and whether those hacks constitute a “search” under the Fourth Amendment.
Should government agencies possess, compel, or sponsor hacking and backdoors? A backdoor is a method of bypassing the normal authentication system of a website, messaging service, or other means of electronic communications.
Privacy and encryption advocates point out that the tools created, and the vulnerabilities exploited, by backdoors pose a privacy risk. The vulnerabilities are not limited to exploitation by U.S. agencies like the FBI and NSA; bad actors and other nations can use them too. Hacking tools don’t always stay secret, and once they are exposed, the potential damage may be measured on a global scale. But what happens when law enforcement needs access for investigatory purposes? The following post looks at a recent example and weighs the competing interests.
The government intelligence community has long vocally advocated for so-called “backdoors” in encrypted digital communications systems. Proponents of these special modes of entry and intercept into otherwise protected databases and communications believe they are a necessary part of national security in the modern age. However, attempts to statutorily codify these ideas have met significant opposition.
Not to be deterred, the government is currently seeking alternate ways to gather information about suspected criminals and terrorists. Two weeks ago, the Senate passed the Cybersecurity Information Sharing Act (CISA). This bill primarily seeks to permit information technology companies to “voluntarily” share information about security threats with the Department of Homeland Security. Companies would receive immunity both from liability and from FOIA requests concerning the shared information. A proposed amendment that would have required scrubbing personally identifiable information from shared data failed to pass.
Back to the Future Day—October 21, 2015—was celebrated this past week to commemorate the day that Marty McFly and Doc Brown traveled through time to save Marty’s future son in Back to the Future II. It’s easy to laugh at the zany fashion and technology—e.g., fax machines—but director Robert Zemeckis got a lot right about 2015. For example, Nike will release a pair of self-lacing sneakers next year, and hoverboards have come close to reality. The film even portrayed a current political candidate as a wacky villain.
While we have yet to reach the Back to the Future-style flying cars depicted in the second film, we are very close to the introduction of self-driving cars into our travel ecosystem. Google’s self-driving car has successfully completed one million miles, and the company is planning to release a model to the general public by 2017. Automotive powerhouses like GM, Ford, Toyota, Daimler-Chrysler, and Volkswagen have all partnered with Google, and Tesla CEO Elon Musk has said that manually operated cars will be illegal once autonomous cars reach 100% penetration.
While last week’s blog post focused on how companies have used data collected from users to do research, this week’s will focus on how large organizations succeed or fail in protecting data both from hackers and from government intrusion.
Following on last week’s post, another dark cloud continues to loom over the Internet: malware. Malware is somewhat two-faced. On one side, hackers use malware to gain access to personal information. On the other side, the government uses malware to track down criminals and terrorists. But what happens when the line separating the two starts to blur? This post will explore the “good” and “bad” sides of deceptive delivery of malware.
We’re back for Season 2 of our ongoing weekly recap of current tech policy news. As always, the TLPC Director (that’s me—Blake Reid) takes on the first blog post of the semester before the TLPC’s student attorneys take over for the duration. As summer comes to a close in Boulder, this post explores some of the dark clouds that have circled over the Internet in recent weeks.
Last week, the TLPC testified at several hearings (PDF) in favor of our proposed exemptions to Section 1201 of the Digital Millennium Copyright Act. We’ve linked below to various pictures and coverage of the hearing. Congratulations to the many TLPC students who took part!
Politico Pro coverage of security research hearing (1, 2)
As we discussed in our previous blog post on the subject, this project seeks an exemption to Section 1201 of the Digital Millennium Copyright Act’s anti-circumvention provisions for good-faith security research. Our Reply Comment responds to a variety of issues raised in the second round of the proceeding by public commenters, including manufacturers and trade groups in the automobile, medical device, software, and related industries. In our Reply Comment, we focus on how the objectors’ comments are textbook examples of Section 1201’s adverse effects chilling good-faith security research, and we push back against the suggestion that an exemption should include a mandatory disclosure standard.