WMG’s CEO Lays Out His Vision & Proposed Rules for AI During Senate Hearing on Deepfakes Bill

The U.S. Senate Judiciary Committee convened on Tuesday (April 30) to discuss a proposed bill that would effectively create a federal publicity right for artists in a hearing that featured testimony from Warner Music Group CEO Robert Kyncl, artist FKA Twigs, Digital Media Association (DiMA) CEO Graham Davies, SAG-AFTRA national executive director/chief negotiator Duncan Crabtree-Ireland, Motion Picture Association senior vp/associate general counsel Ben Sheffner and University of San Diego professor Lisa P. Ramsey.

The draft bill — called the Nurture Originals, Foster Art, and Keep Entertainment Safe Act (NO FAKES Act) — would create a federal right for artists, actors and others to sue those who create “digital replicas” of their image, voice, or visual likeness without permission. Those individuals have previously been protected only through a patchwork of state “right of publicity” laws. First introduced in October, the NO FAKES Act is supported by a bipartisan group of U.S. senators including Sen. Chris Coons (D-Del.), Sen. Marsha Blackburn (R-Tenn.), Sen. Amy Klobuchar (D-Minn.) and Sen. Thom Tillis (R-N.C.).

Warner Music Group (WMG) supports the NO FAKES Act along with many other music businesses, the RIAA and the Human Artistry Campaign. During Kyncl’s testimony, the executive noted that “we are in a unique moment of time where we can still act and we can get it right before it gets out of hand,” pointing to how the government was not able to properly handle data privacy in the past. He added that it’s imperative to get out ahead of artificial intelligence (AI) to protect artists’ and entertainment companies’ livelihoods.

“When you have these deepfakes out there [on streaming platforms],” said Kyncl, “the artists are actually competing with themselves for revenue on streaming platforms because there’s a fixed amount of revenue within each of the streaming platforms. If somebody is uploading fake songs of FKA Twigs, for example, and those songs are eating into that revenue pool, then there is less left for her authentic songs. That’s the economic impact of it long term, and the volume of content that will then flow into the digital service providers will increase exponentially, [making it] harder for artists to be heard, and to actually reach lots of fans. Creativity over time will be stifled.”

Kyncl, who recently celebrated his first anniversary at the helm of WMG, previously held the role of chief business officer at YouTube. When questioned about whether platforms like YouTube, Spotify and others represented by DiMA should be held responsible for unauthorized AI fakes on their platforms, Kyncl had a measured take: “There has to be an opportunity for [the services] to cooperate and work together with all of us to [develop a protocol for removal],” he said.

During his testimony, Davies spoke from the perspective of the digital service providers (DSPs) DiMA represents. “There’s been no challenge [from platforms] in taking down the [deepfake] content expeditiously,” he said. “We don’t see our members needing any additional burdens or incentives here. But…if there is to be secondary liability, we would very much seek that to be a safe harbor for effective takedowns.”

Davies added, however, that the Digital Millennium Copyright Act (DMCA), which provides a notice and takedown procedure for copyright infringement, is not a perfect model to follow for right of publicity offenses. “We don’t see [that] as being a good process as [it was] designed for copyright…our members absolutely can work with the committee in terms of what we would think would be an effective [procedure],” said Davies. He added, “It’s really essential that we get specific information on how to identify the offending content so that it can be removed efficiently.”

There is currently no perfect solution for tracking AI deepfakes on the internet, making a takedown procedure tricky to implement. Kyncl said he hopes for a system that builds on the success of YouTube’s Content ID, which tracks sound recordings. “I’m hopeful we can take [a Content ID-like system] further and apply that to AI voice and degrees of similarity by using watermarks to label content and [track] the provenance,” he said.

The NO FAKES draft bill as currently written would create a nationwide property right in one’s image, voice, or visual likeness, allowing an individual to sue anyone who produced a “newly-created, computer-generated, electronic representation” of it. It also includes publicity rights that would not expire at death and could be controlled by a person’s heirs for 70 years after their passing. Most state right of publicity laws were written long before the invention of AI and often limit or exclude the protection of an individual’s name, image and voice after death.

The proposed 70 years of post-mortem protection was one of the major points of disagreement between participants at the hearing. Kyncl agreed with the points made by Crabtree-Ireland of SAG-AFTRA — the actors’ union that recently came to a tentative agreement with major labels, including WMG, for “ethical” AI use — whose view was that the right should not be limited to 70 years post-mortem and should instead be “perpetual,” in his words.

“Every single one of us is unique, there is no one else like us, and there never will be,” said Crabtree-Ireland. “This is not the same thing as copyright. It’s not the same thing as ‘We’re going to use this to create more creativity on top of that later [after the copyright enters public domain].’ This is about a person’s legacy. This is about a person’s right to give this to their family.”

Kyncl added simply, “I agree with Mr. Crabtree-Ireland 100%.”

However, Sheffner shared a different perspective on post-mortem protection for publicity rights, saying that while “for living professional performers use of a digital replica without their consent impacts their ability to make a living…that job preservation justification goes away post-mortem. I have yet to hear of any compelling government interest in protecting digital replicas once somebody is deceased. I think there’s going to be serious First Amendment problems with it.”

Elsewhere during the hearing, Crabtree-Ireland expressed a need to limit how long a young artist can license out their publicity rights during their lifetime to ensure they are not exploited by entertainment companies. “If you had, say, a 21-year-old artist who’s granting a transfer of rights in their image, likeness or voice, there should not be a possibility of this for 50 years or 60 years during their life and not have any ability to renegotiate that transfer. I think there should be a shorter perhaps seven-year limitation on this.”

Kristin Robinson

Billboard