A four-hour video released on YouTube this week attempted to make viewers care about plagiarism on the platform.
A YouTuber’s deep dive on plagiarism tries to make viewers care when creators steal content.
Copying has always been a part of internet culture. Sometimes it’s ethical, sometimes not. It’s almost always incentivized: Once social media began reshaping online life, copying became a go-to tactic for getting views.
When copying crosses an ethical line, we generally call it plagiarism. And plagiarism is thriving online as well. Get good enough at it, and don’t get caught, and you can make money simply by lifting someone else’s hard work and packaging it as your own. With so much content online, plagiarism can outrun efforts to detect it. The rise of AI-generated content only compounds the problem.
It’s easy to see how we got here. Memes work by copying and tweaking an existing idea, sound, or image. Viral “challenges” ask people to film themselves doing literally the same thing as someone else, from pouring ice water on their heads to performing specific choreography to a song that just blew up on TikTok. If social media success depends on creating things that other people will want to share, then what better way to ensure clicks than by doing the same thing that worked for someone else?
The line between imitation and plagiarism should be clear. Bad actors try to benefit when it’s not. Earlier this year, in the maximalist decor DIY space, one influencer publicly accused another of copying her project videos, when it appeared the two creators may have simply landed on the same design trends at the same time. And over the weekend, I watched a nearly four-hour YouTube video by Harry Brewis, who posts as Hbomberguy, laying out how optimized copying becomes plagiarism. The video spends a great deal of time analyzing one creator in particular: James Somerton, a queer YouTube essayist.
The plagiarism allegations the video levels against Somerton are pretty grim. They include instances in which Somerton appeared to copy text from academics working in queer culture and history, from a book and documentary on the history of LGBTQ people in film, from other queer YouTubers, and from essays published across the web, including, it seems, at least two articles from Vox. But one thing struck me about how Brewis approaches this topic: he doesn’t take it as a given that his audience will care about stolen content.
About 40 minutes into the video, Brewis addresses this directly, telling his viewers that, in part, you should care about plagiarism on YouTube because “internet video isn’t a silly playground where teens pretend to be scared of scary horror games anymore. It’s a business.” Plagiarism of and among creators is stolen labor.
This all brings to mind probably the biggest intellectual property story of the year: how copyright law applies to AI-generated content. Earlier this year, a federal court ruled against someone who tried to copyright a piece of art created by generative AI, writing that so far “no court has recognized copyright in a work originating with a nonhuman.” Generative AI companies have been hit with a number of class action lawsuits arguing that they have unethically lifted published works into their training data. But the issue is not settled, and as Axios notes, the volume of work generated by AI is vastly outpacing attempts to decide who gets to profit from it. And while plenty of people are worried about all sorts of things AI might do, it seems even trickier to make a plagiarism accusation against a machine stick.
Brewis’s video convinces viewers to care about Somerton’s apparent plagiarism by looking at who gets harmed: in this case, the less famous queer writers and YouTubers whose work was seemingly lifted for Somerton’s videos. These writers, Brewis notes, are often not adequately compensated or credited for their ideas in the first place. When a creator who is also part of the LGBTQ community steals from his peers to earn money for himself, that is a harm to the whole community.
There’s no equivalent for AI. AI isn’t part of a community or an occupation that has ethical standards to apply. It might be wrong for a generative AI tool to train on and essentially copy creative works without compensation or permission, but the creators of tools like ChatGPT are generally not participants in the communities they are lifting from in order to train their systems. Perhaps that’s why a lot of the bigger conversations about AI and plagiarism right now seem to focus on students using AI-generated writing to plagiarize their papers.
But AI, like YouTube creation, is a business, run by people who are making money off of its use, including use by cheating students and by well-meaning users whose DALL-E prompts might accidentally generate a copy of a work by the artist Greg Rutkowski. Although the legal and ethical issues surrounding these two spaces sound very different, they’re both essentially about stolen labor.
Somerton has seen some short-term consequences from Brewis’s video. He’s lost 50,000 subscribers in the past month, according to SocialBlade, mostly in the past few days. His Patreon and X accounts are now inaccessible. His YouTube channel remains live. Meanwhile, Brewis’s video has nearly 6 million views as of the afternoon of December 6. Does that mean Brewis successfully made people care about plagiarism on the internet?
Perhaps for a little while, at least. The idea that someone would have to make the case to care about online plagiarism implies that, historically, scandals like these have been survivable for creators. Jonathan Bailey, a writer who tracks online plagiarism for Plagiarism Today, said he was “confident” that Somerton, along with another creator discussed in the video, would at least attempt to reignite their careers after attention moves on.