YouTube’s non-solution for AI podcasts

This is Hot Pod, The Verge’s newsletter about podcasting and the audio industry. Sign up here for more.

Today we have a look at how YouTube’s new AI rules will (and will not) affect podcasting, another late-night host ditching TV for audio, and a bunch of your recs on podcasts covering the Israel-Hamas war.

What do YouTube’s new AI rules mean for podcasts? Not much.

This morning, YouTube released new terms regarding AI-generated content on its platform. As my Verge colleagues Mia Sato and Nilay Patel reported, the company is creating a two-tier system for moderating such content: a strict set of rules for music and a looser, nearly unenforceable standard for everything else (including podcasts). For creators who make podcasts using AI, and for people who may discover an AI-generated clone of their voice on the internet, the new rules to adhere to are very slight.

First, podcasts that use “realistic” AI-generated (or altered) content have to label their videos as such. Some of the bigger podcasts that use AI, such as The Joe Rogan AI Experience, already do this, but it’s generally good practice, so no harm in requiring it. Even with the labeling, though, people can request that YouTube take down videos that “simulate an identifiable individual, including their face or voice.” It is then up to YouTube’s discretion, based on factors such as whether the content counts as satire or whether the person being replicated is a public figure. Music, meanwhile, has no such exceptions because YouTube needs to keep the labels happy (if you can believe it, the podcast lobby is somewhat less influential).

These guidelines, which will roll out next year, are being issued in the absence of any real legal framework for dealing with AI-generated content. While it does seem like an attempt by YouTube to do something, its effectiveness is necessarily limited — and the lack of clarity could lead to some confusing and inconsistent enforcement decisions.

“It doesn’t have the weight of law, and it doesn’t have the advantage of being done out in the open,” says attorney Emily Poler, who handles copyright infringement cases. “There are going to be situations where it’s really hard [for YouTube] to make a principled decision, and those [decisions] will be entrusted to some reasonably low-level employee at YouTube. I don’t think that that’s a recipe for success.”

Moderation was already a mess for these platforms before AI got involved, and each one is taking a different approach. While Spotify is quite permissive toward (and even encouraging of!) AI spoken-word content, Audible has a blanket rule against AI-narrated audiobooks. YouTube seems to be attempting some middle ground. I’ll be curious to…