A Microsoft engineer has submitted safety concerns about the company's AI image generator to the Federal Trade Commission, CNBC reports. Shane Jones, who has worked at Microsoft for six years, wrote in a letter to the FTC that Microsoft "refused" to take down Copilot Designer despite repeated warnings that the tool could produce harmful images.
According to CNBC's report, Jones tested Copilot Designer for safety issues and flaws and found that it generated "demons and monsters alongside terminology related to abortion rights, teenagers with assault rifles, sexualized images of women in violent tableaus, and underage drinking and drug use."
Additionally, Copilot Designer reportedly produced images of Disney characters, such as Elsa from Frozen, in scenes set in the Gaza Strip "in front of destroyed buildings and 'Free Gaza' signs." It also generated an image of Elsa wearing an Israel Defense Forces uniform and holding a shield emblazoned with the Israeli flag. The Verge was able to generate similar images using the tool.
According to CNBC, Jones has been trying to warn Microsoft about DALL-E 3, the model used by Copilot Designer, since December. He posted an open letter about the issue on LinkedIn, but Microsoft's legal team reportedly contacted him and asked him to take the post down.
"Over the last three months, I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards could be put in place," Jones said in the letter obtained by CNBC. "Again, they have failed to implement these changes and continue to market the product to 'Anyone. Anywhere. Any Device.'" Microsoft did not immediately respond to The Verge's request for comment.