By EFF Deeplinks Blog
The latest on the Computer Fraud and Abuse Act? It’s still terrible. And this year, the detrimental impacts of the notoriously vague and outdated criminal computer crime statute showed themselves loud and clear. The statute lies at the heart of the Equifax breach, which might have been averted if our laws didn’t criminalize security research. And it’s at the center of a court case pending in the Ninth Circuit Court of Appeals, hiQ v. LinkedIn, which threatens a hallmark of today’s Internet: free and open access to publicly available information.
At EFF, we’ve spent 2017 working to make sure that courts and policy makers understand the role the CFAA has played in undermining security research, and that the Ninth Circuit rejects LinkedIn’s attempt to transform a criminal law meant to target serious computer break-ins into a tool for enforcing corporate computer use policies. We’ve also continued our work to protect programmers and developers engaged in cutting-edge exploration of technology via our Coders’ Rights Project—coders who often find themselves grappling with the messiness that is the CFAA. As this fight carries us into 2018, we stand ready to do all we can to rein in the worst law in technology.
The CFAA makes it illegal to engage in “unauthorized access” to a computer connected to the Internet, but the statute doesn’t tell us what “authorization” or “without authorization” means. This vague language might have seemed innocuous to some back in 1986 when the statute was passed, but in today’s networked world, where we all regularly connect to and use computers owned by others, courts cannot even agree on what the law covers. And as a result, this pre-Web law is causing serious problems.
One of the biggest problems: the law is notorious for chilling the work of security researchers.
Most of the time, we never hear about the research that could have prevented a security nightmare. But with Equifax’s data breach, we did. As if the news of the catastrophic breach wasn’t bad enough, we learned in October—thanks to reporting by Motherboard—that a security researcher had warned Equifax “[m]onths before its catastrophic data breach . . . that it was vulnerable to the kind of attack that later compromised the personal data of more than 145 million Americans[.]” According to Equifax’s own timeline, the company didn’t patch the vulnerability for six months—and “only after the massive breach that made headlines had already taken place[.]”
The security researcher who discovered the vulnerability in Equifax’s system back in 2016 should have been empowered to bring their findings to someone else’s attention after Equifax ignored them. If they had, the breach may have been avoided. Instead, they faced the risk of a CFAA lawsuit and potentially decades in federal prison.
In an era of massive data breaches that impact almost half of the U.S. population as well as people around the globe, a law that ostracizes security researchers is foolish—and it undermines the security of all of us. A security research exemption is necessary to ensure that our security research community can do their work to keep us all safe and secure without fear of prosecution. We’ve been calling for these reforms for years, and they’re long overdue.
hiQ v. LinkedIn
One thing that’s consistently gotten in the way of CFAA reform: corporate interests. And 2017 was no different in this respect. This year, LinkedIn has been pushing to expand the CFAA’s already overly broad scope so that it can use the statute to maintain its edge over a competing commercial service, hiQ Labs. We blogged about the details of the dispute earlier this year. The social media giant wants to use the CFAA to enforce its corporate policy against using automated scripts—i.e., scraping—to access publicly available information on the open Internet. But what that would mean is potentially criminalizing automated tools that we all rely on every day. The web crawlers that power Google Search, DuckDuckGo, and the Internet Archive, for instance, are all automated tools that collect (or scrape) publicly available information from across the Web. LinkedIn paints all “bots” as bad, but they are a common and necessary part of the Internet. Indeed, “good bots” were responsible for 23 percent of global Web traffic in 2016. Using them to access publicly available information on the open Internet should not be punishable as a federal felony.
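To make concrete how routine this kind of automation is: well-behaved crawlers typically consult a site’s robots.txt file before fetching anything, and skip the paths the site operator has marked off-limits. Here is a minimal sketch of that check using Python’s standard library; the robots.txt contents, the bot name (“ExampleBot”), and the example.com URLs are all hypothetical, used only for illustration.

```python
# Illustrative sketch: how a "good bot" checks robots.txt before scraping.
# The rules and URLs below are hypothetical; parsing happens locally,
# so no network access is involved.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Publicly listed pages are fair game for the crawler...
print(rp.can_fetch("ExampleBot", "https://example.com/public-profile"))  # True
# ...while paths the operator disallowed are skipped.
print(rp.can_fetch("ExampleBot", "https://example.com/private/data"))    # False
```

The point is simply that crawling public pages is an ordinary, cooperative part of how the Web works, governed by conventions like robots.txt rather than by criminal law.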
LinkedIn’s expansive interpretation of the CFAA would exacerbate the law’s chilling effects—not only for the security research community, but also for journalists, discrimination researchers, and others who use automated tools to support their socially valuable work. Similar lawsuits are already starting to pop up across the country, including one by the airline Ryanair alleging that Expedia’s fare scraping violated the CFAA.
Luckily, a court in San Francisco called foul, questioning LinkedIn’s use of the CFAA to block access to public data, finding that the “broad interpretation of the CFAA invoked by LinkedIn, if adopted, could profoundly impact open access to the Internet, a result that Congress could not have intended when it enacted the CFAA over three decades ago.”
The case is now on appeal, and EFF, DuckDuckGo, and the Internet Archive have urged the Ninth Circuit Court of Appeals to uphold the lower court’s finding and reject LinkedIn’s shortsighted request to transform the CFAA into a tool for policing the use of publicly available data on the open Internet. And we’re hopeful it will. During a Ninth Circuit oral argument in a different case in July, Judge Susan Graber pushed back [at around 33:40] on Oracle’s argument that automated scraping was a CFAA violation.
LinkedIn says it wants to protect the privacy of user data. But public data is not private, so why not just put the data behind its pre-existing username and password barrier? It seems that LinkedIn wants to take advantage of the benefits of the open Internet while at the same time abusing the CFAA to avoid the Web’s “open trespass norms.” The CFAA is an old, blunt instrument, and trying to use it to solve a modern, complicated dispute between two companies will undermine open access to information on the Internet for everyone. As we said in our amicus brief:
The power to limit access to publicly available information on the Internet under color of the law should be dictated by carefully considered rules that balance the various competing policy interests. These rules should not allow the handful of companies that collect massive amounts of user data to reap the benefits of making that information publicly available online—i.e., more Internet traffic and thus more data and more eyes for advertisers—while at the same time limiting use of that public information via the force of criminal law.
The Ninth Circuit will hear oral argument on the LinkedIn case in March 2018, and we’ll continue to fight LinkedIn’s expansive interpretation of the CFAA into the New Year.