YouTube Removes Fake AI Trailers, Prompting Fears Of AI Music Takedowns
Prior to their removal, YouTube had placed both Screen Culture and KH Studio under temporary monetisation restrictions
YouTube has begun removing some of the channels responsible for producing and distributing unauthorised AI-generated film trailers, prompting renewed questions about whether a similar crackdown on AI-driven music accounts could follow.
According to Deadline, the video platform has terminated Screen Culture and KH Studio, two of the most prominent creators of AI-generated “fan trailers”, after sustained criticism from studios and audiences. The backlash centred on the reach of these videos and their tendency to confuse viewers or divert attention from official promotional content.
Screen Culture, for instance, reportedly uploaded as many as 23 fabricated trailers for The Fantastic Four: First Steps, with several ranking above the official trailer in search results. The studio’s legitimate trailer, by comparison, has accumulated around 17 million views.
Prior to their removal, YouTube had placed both Screen Culture and KH Studio under temporary monetisation restrictions. Those restrictions were lifted after the channels began adding disclaimers such as “fan trailer” or “parody” to their video titles. However, Deadline reports that the channels later reverted to their earlier practices, ultimately leading to their termination. Together, the two channels are said to have generated close to one billion views.
The move has drawn attention to a broader and increasingly urgent issue across digital platforms: the unchecked growth of AI-generated music. A growing volume of machine-made audio is flooding digital service providers (DSPs), competing for listeners, playlist placements and royalty share with human artists.
While AI tracks that clearly infringe copyright, or that achieve significant commercial success, are often removed or face legal scrutiny, lower-performing AI-generated songs typically remain online. Despite modest view or stream counts, such content continues to proliferate across platforms like YouTube and Spotify.
This is especially visible in the surge of AI-generated cover songs and extended videos “inspired by” artists such as Etta James, B.B. King and Eric Clapton. Although these low-quality outputs are unlikely to draw fans away from the original artists, they can still influence algorithms, distort recommendations and crowd search results.
Industry observers note that much of this content is produced by a relatively small number of high-volume uploaders, with no clear slowdown in sight. With thousands of AI-generated tracks reportedly landing on DSPs each day, pressure is mounting on platforms to respond more decisively.
Against this backdrop, a future wave of takedowns targeting AI-driven music accounts, on YouTube and beyond, no longer seems improbable, with 2026 increasingly being viewed as a potential turning point for stricter enforcement.