EFF Warns Against Broad “Stay Down” Anti-Piracy Filters


Copyright holders want websites to implement strict filters to guarantee that content stays down after a DMCA notice is received. The EFF warns against these demands, arguing that they will lead to a "filter everything" approach. According to the EFF this will result in more abuse and mistakes from often automated takedown bots.

This month the U.S. Copyright Office launched a public consultation to evaluate the effectiveness of the DMCA’s safe harbor provisions.

The study aims to identify problems with the current takedown procedures, addressing issues such as how ISPs should handle repeat infringers, abuse of the takedown process, and the ever-increasing volume of DMCA notices.

One issue that’s been high on the agenda is the request from copyright holder groups to ensure that content “stays down” after it’s removed. For example, when Google removes a copyrighted image from its search results, it should ensure that the same image doesn’t reappear under another URL.

This “take down, stay down” approach is being pushed by industry groups including the MPAA and RIAA, which believe that the current takedown procedures are not effective.

However, not everyone welcomes tighter rules. In particular, the recent proposals struck a nerve with the Electronic Frontier Foundation (EFF), which warns against such broad copyright filters.

“Now, some lobbyists think that content filtering should become a legal obligation: content companies are proposing that once a takedown notice goes uncontested, the platform should have to filter and block any future uploads of the same allegedly infringing content,” the EFF’s Elliot Harmon notes.

“Filter-everything would effectively shift the burden of policing copyright infringement to the platforms themselves, undermining the purpose of the safe harbor in the first place.”

One of the problems is that copyrighted content may be infringing on one site, but not on another. For example, a video creator may want to take down infringing copies of his work, but that doesn’t mean that all the licensed versions should be removed from the web too.

In addition, the EFF points out that automated takedown tools are far from perfect. The takedown ‘robots’ that copyright holders employ often make mistakes, removing access to content that’s not infringing at all.

“Here’s something else to consider about copyright bots: they’re not very good,” Harmon writes.

“Content ID routinely flags videos as infringement that don’t copy from another work at all. Bots also don’t understand the complexities of fair use. In September, a federal appeals court confirmed that copyright holders must consider fair use before sending a takedown notice,” he adds.

The EFF does agree with copyright holders that the DMCA notice-and-takedown procedure isn’t perfect. But instead of stricter filtering, the group would like more safeguards to ensure that free speech and fair use are protected, which it says is not the case at the moment.

“You don’t need to look far to find examples of copyright holders abusing the system, silencing speech with dubious copyright claims,” Harmon notes.

“Under the filter-everything approach, legitimate uses of works wouldn’t get the reasonable consideration they deserve. Even if content-recognizing technology were airtight, computers would still not be able to consider a work’s fair use status,” he adds.

The above clearly shows that there’s a great divide over how the DMCA takedown process should operate and what changes the U.S. Government should implement.

Considering the parties involved and the stakes at hand, copyright holders, Internet services and ISPs will do everything in their power to convince the Copyright Office that they know what’s best for the future of the Internet.
