For those interested in using AI to generate synthetic nudes, YouTube is a great place to start.
Although it doesn't host any "undressing" tools itself, the video-sharing platform used by 2.5 billion people has hosted videos promoting how quickly AI apps and websites can remove clothing from images of women. A Forbes review found more than 100 such videos, many with large view counts.
Some of the YouTube videos provided tutorials on an app that high school students in Spain and New Jersey allegedly used to generate nudes of their female classmates. The alleged victims have faced bullying, public humiliation, and panic attacks.
Another website featured in many of the YouTube videos was cited in court documents in a 2023 case in which a child psychiatrist was sentenced to 40 years in prison for sexually exploiting minors, including using AI to create child sexual abuse images. He was accused of using the tool to undress images of his underage high school girlfriend. "It's scary to think that in this digital age, my photo, an innocent photo, could be taken and altered without my consent for illegal and hateful purposes," the ex-girlfriend testified in court.
“When I see or hear about AI, I think of him in the back of my mind.”
Signy Arnason, deputy executive director of the Canadian Centre for Child Protection, told Forbes it was "unthinkable" that Google was promoting the use of these apps. "Even in YouTube and Google search results, you can easily find instructional videos and services with titles that overtly promote these types of applications," she added. She said her organization is receiving increasing reports from schools where students have been victimized by AI-generated nude imagery.
Google's AI undressing problem extends beyond YouTube. Forbes identified three Android apps that remove clothing from photos, including "Nude Scanner Photo Filter," which has been downloaded more than 10 million times. A Spanish-language app that lets users "swipe your finger over anything you want to erase, such as a swimsuit" has been installed more than 500,000 times. A third, Scanner Body Filter, which offers to add "sexy body images" to photos, has also been downloaded 500,000 times.
Forbes also found 27 ads in the Google Ads Transparency Center promoting "deepnude" services. One promoted a site with the word "baby" in its URL. The National Center on Sexual Exploitation (NCOSE) provided information on four more, including a nudify site openly offering to create AI photos of Taylor Swift.
After Forbes asked whether the videos, ads, and apps violated Google's policies, all 27 ads were removed, and YouTube took down 11 channels and more than 120 videos. One of those channels, hosted by an AI-generated male avatar, was responsible for more than 90 of the videos, most of which pointed to a Telegram bot that undresses images of women. The Scanner Body Filter app was also made unavailable for download, though the other Android apps remained online.
Tori Rousay, corporate advocacy program manager and analyst at NCOSE, said the undressing apps have created a "continuous profit cycle" for Google, which takes advertising money from their developers as well as a share of ad revenue and one-time payments when the apps are hosted on the Google Play Store. By contrast, Rousay said, when NCOSE highlighted the nudify apps hosted on the App Store, Apple quickly removed them.
“Apple listened… Google must do the same,” Rousay added. “Google must develop responsible practices and policies regarding the prevalence of image-based sexual abuse.”
AI-generated deepfake pornography is on the rise, including imagery of children. The National Center for Missing and Exploited Children told Forbes this week that it received 5,000 reports of AI-generated child sexual abuse material (CSAM) last year. Earlier this year, a Wisconsin man was indicted for allegedly using the Stable Diffusion 1.5 image generator to produce CSAM.
In the case of the convicted child psychiatrist, other victims whose childhood photos he had undressed with AI also testified in court about their ongoing trauma.
"I'm worried that because he used my images online to create child pornography, others will have access to those images as well: coworkers, family members, community members, other children. I'm worried this imagery could be accessed by sexual predators," one victim said. Another added: "I'm scared of artificial intelligence because of him. When I see or hear about AI, he's in the back of my mind."