As AI Grows, Artists & Labels Consider: Who Owns a Voice?

In April, Grimes encouraged artists to make music using her voice — as replicated by artificial intelligence-powered technology. Even as she embraced a high-tech future, however, she noted that there were some old-fashioned legal limitations. “I don’t own the rights to the vocals from my old albums,” she wrote on X. “If you make remixes, they may get taken down.”

Artificial intelligence has dominated the hype cycle in 2023. But most signed artists who are enthusiastic about testing out this technology will have to move cautiously, wary that preexisting contracts may assert some level of control over how they can use their voice. “In general, in a major label deal, they’re the exclusive label for name, likeness and voice under the term,” says one veteran manager who spoke on the condition of anonymity. “Labels might be mad if artists went around them and did a deal themselves. They might go, ‘Hey, wait a minute, we have the rights to this.’”

On the flip side, labels probably can’t (or won’t) move unilaterally either. “In our agreements, in a handful of territories, we’ve been getting exclusive name, image, likeness and voice rights in connection with recordings for years,” says one major label source. That said, “as a practical matter, we wouldn’t license an artist’s voice for a voice model or for any project without the artists being on board with it. It would be bad business for us.”

For the moment, both sides are inching forward, trying to figure out how to “interpret new technology with arcane laws,” as Arron Saxe, who manages several artists’ estates, puts it. “It’s an odd time because the government hasn’t stepped in and put down real guidelines around AI,” adds Dan Smith, general manager of the dance label Armada Music. 

That means guidelines must be drawn via pre-existing contracts, most of which were not written with AI in mind, and often vary from one artist to the next. Take a recent artist deal sent out by one major label and reviewed by Billboard: Under the terms, the label has the “exclusive right to record Artist Performances” with “performance” broadly defined to include “singing, speaking… or such performance itself, as the context requires.” The word “recording” is similarly roomy: “any recording of sound…by any method and on any substance or material, whether now or hereafter known.” 

An artist signed to this deal probably couldn’t easily go rogue and build a voice-cloning model on newly recorded material without permission. Even to participate in YouTube’s recently announced AI voice generation experiment, some artists needed to get permission in the form of a “label waiver,” according to Audrey Benoualid, a partner at Myman Greenspan Fox Rosenberg Mobasser Younger & Light. (In an interview about YouTube’s new feature, Demis Hassabis, CEO of Google DeepMind, said only that it has “been complicated” to negotiate deals with various music rights holders.) Even after an artist’s deal ends, if their recordings remain with a label, they would have to be careful to train voice-cloning tech only with material that isn’t owned exclusively by their former record company.

It’s not just artists who are interested in AI opportunities, though. Record labels stand to gain from developing licensing deals with AI companies for their entire catalogs, which could in turn bring greater opportunities for artists who want to participate. At the Made on YouTube event in September, Warner Music Group CEO Robert Kyncl said it’s the label’s “job” to make sure that artists who lean into AI “benefit.” At the same time, he added, “It’s also our job together to make sure that artists who don’t want to lean in are protected.”

In terms of protections, major label deals typically come with a list of approval rights: Artists will ask that they get the chance to sign off on any sample of their recordings or the use of one of their tracks in a movie trailer. “We believe that any AI function is just another use of the talents’ intellectual property that would take some approval by the creator,” explains Leron Rogers, a partner at Fox Rothschild.

In many states, artists also have protection under the “right of publicity,” which says that people have control over the way others can exploit their individual identities. “Under that umbrella is where things like the right to your voice, your face, your likeness are protected and can’t be mimicked because it’s unfair competition,” says Lulu Pantin, founder of Loop Legal. “But because those laws are not federal, they’re inconsistent, and every state’s laws are slightly different” — not all states specifically call out voices, for example —  “[so] there’s concern that that’s not going to provide robust protection given how ubiquitous AI has become already.” (A lack of federal law also limits the government’s ability to push for enforcement abroad.) 

To that end, a bipartisan group of senators recently introduced a draft proposal of the NO FAKES Act (“Nurture Originals, Foster Art, and Keep Entertainment Safe”), which would enshrine a federal right for artists, actors and others to take legal action against anyone who creates unauthorized “digital replicas” of their image, voice, or likeness. “Artists would now gain leverage they didn’t have before,” says Mike Pelczynski, who serves on the advisory board of the company voice-swap.ai.

While the entertainment industry tracks NO FAKES’ progress, Smith from Armada believes “we will probably start to see more artist agreements that are addressing the use of your voice.” Sure enough, Benoualid says that in new label deals for her clients, she now asks for approval over any use of an artist’s name, likeness, or voice in connection with AI technology. “Express written approval should be required prior to a company reproducing vocals, recordings, or compositions for the purpose of training AI platforms,” agrees Matthew Gorman, a lawyer at Cox & Palmer. 

Pantin has been keeping an eye on the way other creative fields are handling this fast-evolving tech to see if there are lessons that can be imported into music. “One thing that I’ve been trying to do and I’ve had success in some instances with is asking the rights holders — the publishers, the labels — for consent rights from the individual artists or songwriter before their work is used to train generative AI,” she says. “On the book publishing side, the Authors Guild has put forth language they recommend be included in all publishing agreements, and so I’m drawing from that and extending that to songwriting.”

All these discussions are new, and the long-term impact of AI-driven technology on the creative fields remains unclear. Daouda Leonard, who manages Grimes, is adamant that in the music industry’s near future, “the licensing of voice is going to become a valuable asset.” Others are less sure — “nobody really knows how important this will be,” the major label source says.

Perhaps Grimes put it best on X: “We expect a certain amount of chaos.”

Elias Leight

Billboard