In late 2020, Senator Thom Tillis released a discussion draft of the “Digital Copyright Act” (DCA), which is intended as a successor to the current DMCA.
The DCA hints at far-reaching changes to the way online intermediaries approach the piracy problem. Among other things, these services would have to ensure that pirated content stays offline after it’s taken down once.
This “takedown and staydown” approach relies on technical protection tools, which include upload filters. This is a sensitive subject that previously generated quite a bit of pushback when the EU drafted its Copyright Directive.
To gauge the various options and viewpoints, the Copyright Office launched a series of consultations on technical tools that can help detect and remove pirated content from online platforms.
This effort includes a public consultation where various stakeholders and members of the public were invited to share their thoughts, which they did en masse.
Thousands of Comments
In total, nearly 6,000 responses came in. These include overviews from the big tech platforms that already use automated takedown tools, such as Google, Microsoft, and Meta Platforms.
Google, for example, provides an overview of the various technical measures it uses to combat copyright infringement. This includes hash filtering on Google Drive, demotion of pirate sites in its search engine, and YouTube’s flagship Content ID system.
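Google’s filing doesn’t detail how its hash filtering works, but the general idea behind hash-based matching can be sketched as follows. Note that the file contents and blocklist below are purely illustrative, and that production systems typically rely on perceptual or fuzzy fingerprints rather than a plain cryptographic hash, precisely because an exact hash misses even trivially modified copies:

```python
import hashlib

# Hypothetical blocklist of SHA-256 hashes of known-infringing files.
# In a real system this would be populated from rightsholder submissions.
BLOCKED_HASHES = {
    hashlib.sha256(b"pirated-movie-bytes").hexdigest(),
}

def is_blocked(file_bytes: bytes) -> bool:
    """Return True if the upload's hash matches a known-infringing hash."""
    return hashlib.sha256(file_bytes).hexdigest() in BLOCKED_HASHES

print(is_blocked(b"pirated-movie-bytes"))   # True: an exact copy is caught
print(is_blocked(b"pirated-movie-bytes!"))  # False: any change defeats a plain hash
```

The second check hints at why this debate is so contentious: exact matching is easy to evade, while fuzzier matching catches more copies but also sweeps in transformed, and potentially fair, uses.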
Many of these solutions are voluntary and go beyond what’s legally required. However, Google remains critical of mandatory upload filter requirements.
“While we believe our efforts in this space have been effective and targeted, we remain concerned about the potential impact of proposals to condition safe harbors on the implementation of any specific technical measures, in particular the automated filters that would be necessary to operationalize a ‘notice and staydown’ regime,” Google writes.
Effective Takedown Tech?
The consultation also elicited many responses from rightsholders and anti-piracy groups, including the Motion Picture Association. According to the MPA, technical measures can be very effective if they’re deployed correctly.
“While cross-industry technical measures have thus far not enjoyed great success, we believe the adoption and broad implementation of technical measures can greatly assist in addressing the problem of online copyright piracy,” MPA writes.
The MPA recognizes that any technology carries the risk of abuse and overcorrection, but argues that faulty and fraudulent takedown notices are often the result of human error and intentional abuse, not the technology itself.
Not everyone is confident that automated takedown tools are the answer. The Electronic Frontier Foundation (EFF), for example, stresses that automated filtering tools often miss important context that can differentiate between clear copyright infringement and fair use.
This concern is also broadly supported by comments from the general public, which make up the majority of the consultation responses. It is impractical to provide a detailed summary of the thousands of responses, but from what we have seen, the vast majority are clearly against automated filters.
Public Upload Filter Opposition
One cautioning comment comes from Katy Wood, who notes that automated filters are quite blunt and inefficient.
“The biggest obstacle is that any sort of automatic detection is going to completely lack nuance, which has already been proven time and time again. The answer to copyright issues is not strangling the internet, it is providing better legal protections to those who are stolen from,” Wood writes.
The risk that automated upload filters can target legitimate content is echoed in hundreds of comments, including the examples below.
“Upload filters will only harm innocent users who minimally use copyrighted content under fair use, as memes, jokes and/or video/text reviews,” Aaron Sargent writes.
“Automatic filters are a bad idea for copyrighted content because they do not factor in context or identity,” B. O. says.
“No service should ever be using automated filters because *they don’t work*. They take down lots and lots of 100% legal independent content, including expressive political speech which is accorded the highest level of first amendment protection,” Nathanael Nerode writes.
“Filters are unable to factor in things like context, which leaves them routinely filtering out legal speech. This restricts one of our inalienable rights, the right to free speech and expression,” Mary Weien adds.
There are also comments from people who share their personal experiences. Bruce Ryan, for example, who’s in his late 50s, converted some Super 8 films from his youth that were recorded by his now 92-year-old father.
The videos were uploaded to YouTube but were not publicly listed. However, that didn’t prevent them from being flagged by the Content ID system.
“In a scene with a chaotic childhood baseball game, my father narrated it over background banjo music. The filter had automatically scanned that 1-minute clip of a 50+ year old instrumental and declared my father’s use of it in his home video to be a copyright violation,” Ryan writes.
“YouTube gave me the option of stripping *all* the audio from the clip I uploaded which would have also removed my father’s narration. Losing my father’s voice would have severely diminished the joy of these old home videos,” he adds.
Artists Don’t Want Filters
For balance, we also searched for submissions from individual artists. While there are larger groups advocating for more automated protection, the individual comments we found were critical of upload filters as well.
Several creators argue that stricter upload controls will mainly benefit large corporations, but not so much individual artists.
Cecilia Ross, who describes herself as a private artist, is one of the people warning against large companies being given more control over what’s posted online.
“From what I’ve seen, most technical measures put private content creators and rightsholders at a disadvantage. Any company can claim something as their own and it’s difficult for private artists to protect their work from predatory companies,” Ross notes.
Another artist, Melissa Fitzgerald, stresses that piracy is a terrible thing. However, in fighting it, fair use shouldn’t be overlooked.
“The [filtering] systems are not nuanced enough to recognize fair use. If measure[s] are enacted to police copyright more on the internet it will only benefit the already wealthy and powerful.
“The other result of this is fair use will be squished and the internet will be in the grip of a few companies that are already becoming monopolies. The public domain needs to stay in the hands of the public, intellectual property needs to be let out into the world and remixed,” Fitzgerald adds.
Many other artists and creators share the same outlook. The general consensus, based on the comments we’ve seen, is that protecting copyright is a good idea, but not through automated systems that mostly benefit large corporations.
While we haven’t read all 6,000 comments in detail, the overall impression we get is that the public at large doesn’t see automatic upload filters as a good idea. But by now, that should hardly come as a surprise.