Sam Gutelle
The increasing presence of AI in creative fields has sparked controversy, particularly when it comes to compensating human creators. Creators worry that AI-generated content infringes on their intellectual property and diminishes the value of their original work, and advocates say the most direct fix is also the most often overlooked one: paying them for that work.
The tension is sharpest in industries like music, writing, and visual art, where AI tools can replicate styles and generate content that competes directly with human creators. Industry advocates argue that equitable compensation and proper licensing agreements can ease those frictions and let the creative ecosystem flourish alongside advances in AI.
The goal is a landscape where AI and human creativity coexist and complement one another rather than one overshadowing the other, and where creators can trust that their work is valued and protected. A young startup is taking a step in that direction by compensating creators for content used in AI training.
Through its new program, “License to Scrape,” creators who worry about infringement by generative AI models can enter into licensing deals that protect their intellectual property (IP) and pay them when their content is scraped. Those deals are modeled on the ones used in traditional arts industries like film and music, a familiar framework for the company’s CEO, Dave Davis, who previously worked at the Motion Picture Licensing Corporation.
Paying creators for AI data
The company’s creator partners are compensated through a revenue-sharing program: it negotiates sublicensing agreements with AI companies and passes a portion of the resulting royalties to the creators whose content is scraped. “There’s obvious demand from AI companies to scrape YouTube content,” Davis said. “So what we’re trying to do is to create a tool that makes it legal and simple for them.” The company will need to offer at least 25,000 hours of content before AI companies buy into its business model, a threshold that favors bigger creators who can provide large libraries.
One agency supplying the company with creators is Viral Nation, where the program has drawn positive feedback from creators, according to Head of Content Licensing Bianca Serafini. Aside from channel size, another hurdle the company faces is creator skepticism about generative AI: some creators feel that poorly regulated scraping policies let generative AI models pirate their content.
To address that concern, the License to Scrape program aims to give creators more control over how their content is used for AI training. The startup positions itself as a home for ethical AI training practices, and its model has room to grow beyond licensing alone. YouTube, for example, is working on a system to identify and control AI-generated duplicates of content.
As the company builds its repository of creator data, it could offer a similar protection from a third-party perspective. Ultimately, creators need to buy into the program, ideally getting paid in the process, so they can have peace of mind about their content being used ethically in AI training.