If a California eighth-grader shares a nude photo of a classmate with a friend without their consent, the student could be charged under state laws dealing with child pornography and disorderly conduct.
If the photo is an AI-generated deepfake, however, it is not clear whether any state law applies.
That's the dilemma facing the Beverly Hills Police Department as it investigates a group of Beverly Vista Middle School students who allegedly shared photos of their classmates that had been altered using an artificial intelligence-powered app. According to the school district, the images superimposed students' real faces onto AI-generated nude bodies.
Lt. Andrew Myers, a spokesman for the Beverly Hills Police Department, said no arrests have been made and the investigation is ongoing.
Beverly Hills Unified School District Superintendent Michael Bregy said the district's investigation into the episode is in its final stages.
“We are pleased that disciplinary action was taken immediately and that this was a contained and isolated incident,” Bregy said in a statement, adding that information such as the nature of the disciplinary action, the number of students involved and their grade levels has not been disclosed.
He urged Congress to prioritize the safety of America's children. “Technology, including AI and social media, can be used in incredibly positive ways,” he said, “but, like cars and tobacco, it can be devastating when unregulated.”
But whether the fake nudes constitute a criminal offense is a question complicated by the technology involved.
Under federal law, the prohibition on child pornography includes computer-generated images of identifiable persons. Legal experts warn that while the prohibition is clear, it has not yet been tested in court.
California's child pornography law does not address artificially generated images. Instead, it applies to any image that “depicts a person under 18 years of age personally engaging in or simulating sexual conduct.”
Santa Ana criminal defense attorney Joseph Abrams said the AI-generated nudes “do not depict real people.” They could be defined as child erotica, he said, but not child pornography. And from his perspective as a defense attorney, he said, “I don't think it crosses the line with this particular statute or any other statute.”
“As we move into this age of AI, these kinds of issues are going to be litigated,” Abrams said.
Kate Ruane, director of the Free Expression Project at the Center for Democracy and Technology, said early versions of digitally altered child sexual abuse material superimposed pornographic images of other people's bodies onto children's faces. But now freely available “undressing” apps and other programs are generating fake bodies to match real faces, raising legal issues that have yet to be addressed head-on, she said.
Still, she said, it's hard to see why the law wouldn't cover sexual images just because they're artificially generated. “The evil we were trying to address [with the prohibition] is the harm to children associated with the existence of the images,” Ruane said. “It's exactly the same here.”
But there are other obstacles to criminal prosecution. In both state and federal cases, the prohibitions apply only to “sexually explicit conduct,” which boils down to sexual intercourse, other sexual acts and “lascivious” exhibition of a child's private parts.
Courts use a six-pronged test to determine whether something is a lascivious exhibition, considering such factors as what the image focuses on, whether the pose is natural, and whether the image is intended to arouse the viewer. Courts will need to weigh those factors when evaluating images that were not sexual in nature before being “undressed” by AI.
“It's really going to depend on what the final picture looks like,” said Sandy Johnson, senior legislative policy adviser for the Rape, Abuse and Incest National Network, the largest anti-sexual-violence organization in the United States. “It's not just a nude photo.”
Abrams said the ages of the children involved are no defense against conviction, because minors have no more right than adults to possess child pornography. However, like Johnson, he pointed out that “nude photographs of children are not necessarily child pornography.”
Neither the Los Angeles County District Attorney's Office nor the state Department of Justice immediately responded to requests for comment.
State lawmakers have proposed several bills to fill gaps in the law regarding generative AI. These include proposals to extend criminal prohibitions on the possession of child pornography and the non-consensual distribution of intimate images (also known as “revenge porn”) to computer-generated images, as well as a proposal to convene a working group of academics to advise lawmakers on “the relevant issues and impacts of artificial intelligence and deepfakes.”
In Congress, lawmakers have introduced competing bills that would expand federal criminal and civil penalties for the non-consensual distribution of intimate AI-generated images.
At Tuesday's district school board meeting, Dr. Jane Tabiev Asher, chair of the Department of Child Neurology at Cedars-Sinai, urged the board to consider the impact of giving children “access to so much technology in and out of the classroom.”
Asher said that instead of interacting and socializing with other students, children are allowed to spend their free time at school on their own devices. “If they were on a screen all day, what would they like to do at night?”
Research shows that children under 16 should not be on social media, she said. Pointing to how the district had been blindsided by the AI-generated nudes, she warned: “We have to protect our children from it.”
Board members and Bregy all expressed outrage about the images at the meeting. “This challenges the foundation of trust and safety that we strive for every day for all of our students,” Bregy said, adding that this is not only happening here.
“I would like parents to continue to check their [children's] phones: what apps are on the phone, what they send, what social media sites they use,” he said. These devices “open the door to a lot of new technologies that are coming out completely unregulated.”
Board member Rachel Marcus noted that the district prohibits students from using cellphones at school. “I think we as parents need to have more control over what our students are doing on their phones, and we are completely failing in that regard,” she said.
“From my perspective, the missing piece right now is partnership with parents and families,” board member Judy Manucelli said. “There are lots of programs to get kids off their phones in the afternoon.”