
Senators introduce bill to protect individuals against AI-generated deepfakes

Today, a group of senators introduced the NO FAKES Act, a bill that would make it illegal to create digital recreations of a person's voice or likeness without that individual's consent. It's a bipartisan effort from Senators Chris Coons (D-Del.), Marsha Blackburn (R-Tenn.), Amy Klobuchar (D-Minn.) and Thom Tillis (R-N.C.), fully titled the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024.

If it passes, the NO FAKES Act would create an option for people to seek damages when their voice, face or body is recreated by AI. Both individuals and companies would be held liable for producing, hosting or sharing unauthorized digital replicas, including ones made by generative AI.

We've already seen many instances of celebrities finding AI-generated imitations of themselves out in the world. "Taylor Swift" was used to scam people with a fake Le Creuset cookware giveaway. A voice that sounded a lot like Scarlett Johansson's showed up in a ChatGPT voice demo. AI can also be used to make political candidates appear to make false statements, with Kamala Harris the most recent example. And it's not only celebrities who can be victims of deepfakes.

"Everyone deserves the right to own and protect their voice and likeness, no matter if you’re Taylor Swift or anyone else," Senator Coons said. "Generative AI can be used as a tool to foster creativity, but that can’t come at the expense of the unauthorized exploitation of anyone’s voice or likeness."

The speed of new legislation notoriously lags behind the speed of new tech development, so it's encouraging to see lawmakers taking AI regulation seriously. Today's proposed act follows the Senate's recent passage of the DEFIANCE Act, which would allow victims of sexual deepfakes to sue for damages.

Several entertainment organizations have lent their support to the NO FAKES Act, including SAG-AFTRA, the RIAA, the Motion Picture Association, and the Recording Academy. Many of these groups have been pursuing their own actions to win protection against unauthorized AI recreations. SAG-AFTRA recently went on strike against several game publishers to try to secure a union agreement covering likenesses in video games.

Even OpenAI is listed among the act's backers. "OpenAI is pleased to support the NO FAKES Act, which would protect creators and artists from unauthorized digital replicas of their voices and likenesses," said Anna Makanju, OpenAI's vice president of global affairs. "Creators and artists should be protected from improper impersonation, and thoughtful legislation at the federal level can make a difference."

This article originally appeared on Engadget at https://ift.tt/6i7QH0l

