Apple, Google removed AI nude deepfake apps from app stores

Photo: Thomas Trutschel/Photothek (Getty Images)

Apple and Google have removed apps that promised deepfake nudes from the App Store and Google Play, according to a report from 404 Media.

Earlier this week, 404 Media found that Instagram was hosting advertisements for apps that use AI to generate deepfake nudes. One such ad used a picture of Kim Kardashian with the slogan, “Undress any girl for free. Try It.” That service, and others like it, appear to allow users to upload real photos of women and generate fake pictures of them naked. 404 Media’s report came just weeks after Meta moved to make it harder to send and receive nudes through Instagram Messenger, an effort it said was intended to protect teen users from predators.

Deepfake nudes have been a major problem in the early era of generative AI. AI-generated nudes of Taylor Swift flooded X in January, with one such photo garnering tens of millions of views and 24,000 reposts. Deepfake nudes of teen girls at schools in New Jersey, Washington, and California prompted investigations in late 2023. So far, about two dozen U.S. states have introduced legislation to crack down on sexually explicit AI-generated content. And last month, two teen boys in Miami, Florida, were arrested and charged with third-degree felonies for using so-called “undress” apps to make nudes of their classmates.

There’s no federal law on the books regarding AI-made nudes, but U.S. senators introduced legislation in January that would allow victims to sue their perpetrators.

“Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit deepfakes is very real,” said U.S. Sens. Richard Durbin and Lindsey Graham in their announcement of the bill, dubbed the DEFIANCE Act. “Victims have lost their jobs, and may suffer ongoing depression or anxiety.”
