On Thursday, a bipartisan group of US senators announced plans to introduce a bill that, if passed, would make it illegal to create an AI replica of another person without their consent. The proposed federal legislation is dubbed the ‘No Fakes Act.’
This follows a rise in AI replication, such as the fake Drake and The Weeknd track “Heart on My Sleeve,” which was produced without the two artists’ consent and raised ethical concerns around the use of AI in the art and entertainment industries.
The bill would therefore prohibit the “production of a digital replica without consent of the applicable individual or rights holder.”
Protection against AI replication
According to an article by Decrypt, the bill was drafted by Senators Marsha Blackburn, Thom Tillis, Amy Klobuchar, and Chris Coons.
The bill responds to artists’ objections, on ethical grounds among others, to the use of AI technology in the industry.
Generative AI has allowed ordinary people with no background in the art industry to create their own songs and memes mimicking other people’s likenesses without their consent.
Now the proposed bill, if passed into law, would make it illegal to generate content using someone’s likeness without their permission. Senator Coons said in a statement that there has been pressure on Congress to act.
“Creators around the nation are calling on Congress to lay out clear policies regulating the use and impact of generative AI, and Congress must strike the right balance to defend individual rights, abide by the First Amendment, and foster AI innovation and creativity,” he said.
Violators would face a fine of up to $5,000 per violation, along with any additional damages owed to the affected party as a result of the violation.
According to the Decrypt article, Senator Coons’ statement specifically cited the song “Heart on My Sleeve” as the “exact type of content that would be made illegal under the No Fakes Act.”
Released earlier this year, the AI-generated song featuring the voices of Drake and The Weeknd went viral and triggered debates on the use of AI in art and entertainment, with artists like Ice Cube labeling AI as demonic.
Ice Cube made it clear he would take legal action against anyone using his voice or style to produce AI music, as well as the distributors of such content.
The song was later removed from streaming platforms like Spotify after an outcry from stakeholders such as Universal Music Group. Its creator, known as Ghostwriter, later pushed the song for a Grammy nomination, but the bid was rejected on technical grounds.
The song was viewed 8.5 million times on TikTok a week after it was posted and played 254,000 times on Spotify. Other artists who have fallen victim to generative AI include Rihanna and Kanye.
Recently, Hollywood actor Tom Hanks disowned a video with an AI version of himself in a dental ad. He said he was not a part of the production of the video, further bringing to light the pitfalls of the technology, which has been one of the reasons for the strikes in Hollywood.
Support from stakeholders
The Hollywood actors’ union has been on strike since July over, among other concerns, the lack of clear AI regulation in the industry and studios’ plans to use AI to replace background actors while owning their likenesses forever.
The union, SAG-AFTRA, has already weighed in, welcoming the No Fakes Act as a step toward AI regulation.
“A performer’s voice and their appearance are all part of their unique essence, and it’s not OK when those are used without their permission,” said SAG-AFTRA president Fran Drescher in a statement.
“Consent is key, and I’m grateful that Sens. Coons, Blackburn, Klobuchar, and Tillis are working to give performers recourse and provide tools to remove harmful material.”
Platforms also liable
In a nutshell, the bill seeks “to prevent a person from producing or distributing an unauthorized AI-generated replica of an individual to perform in an audiovisual or sound recording without the consent of the individual being replicated.”
This means platforms are also liable if found to be distributing such content. According to Musically, the likes of YouTube, TikTok, Spotify, and all the other services where deepfakes have recently been distributed would be exposed to liability.