A spokesperson for the City of Beverly Hills announced Wednesday that the city has launched an investigation into reports that a middle school student created fake nude photos of his classmates.
Keith Sterling, assistant city manager for Beverly Hills, said authorities are investigating a student at Beverly Vista Middle School who they say used artificial intelligence tools to create the images and shared them with other students.
In a letter to parents, the school district's superintendent said school officials were made aware of the student's “AI-generated nude photos” last week.
The case follows a spate of similar AI-generated nude photo incidents at high schools around the world, after which students and parents told NBC News they were scared to go to school or to send their children there. The advent of sophisticated, easily accessible apps and programs that "undress" photos, along with "face-swapping" tools that superimpose a victim's face onto pornographic content, has led to an explosion of nonconsensual sexually explicit deepfakes that primarily target women and girls.
Mary Ann Franks, president of the Cyber Civil Rights Initiative and a professor at George Washington University Law School, previously told NBC News that whether AI-generated nude photos of students are illegal depends on the facts of the case and what the images depict.
For example, Franks said, a case could involve criminal harassment, or the material could be considered child sexual abuse material (CSAM, the term experts and advocates prefer over "child pornography"). Not all nude photos of children, AI-generated or not, fall under the legal definition of CSAM, but some do, including some AI-generated depictions. For a depiction to be illegal, it must show sexually explicit conduct, a higher bar than mere nudity.
“Federal and other laws prohibit depictions that combine a real child's face or other body parts with other material,” Franks said. “Depending on the facts, this could also amount to harassment or stalking.”