Google has admitted that its Gemini AI model “missed the mark” after a flurry of criticism over what many perceived as “anti-white bias.” Numerous users reported that the system was generating images of people of diverse ethnicities and genders even when it was historically inaccurate to do so. On Thursday, the company said it planned to “pause” Gemini's ability to generate images of people until a fix could be rolled out.
When prompted to create an image of a Viking, Gemini showed only Black people in traditional Viking garb. A request for images of the Founding Fathers returned Indigenous people in colonial outfits; another result depicted George Washington as Black. When asked to produce an image of the pope, the system showed only people of ethnicities other than white. In some cases, Gemini said it could not produce images of historical figures like Abraham Lincoln, Julius Caesar, or Galileo at all.
Many right-wing commentators jumped on the issue, suggesting it was further evidence of anti-white bias in Big Tech, with entrepreneur Mike Solana writing that “Google's AI is an anti-white lunatic.”
But this situation mainly highlights that generative AI systems are not very smart.
“I think this is just terrible software,” Gary Marcus, a professor emeritus of psychology and neuroscience at New York University and an AI entrepreneur, wrote on Substack on Wednesday.
Google announced its Gemini AI model two months ago as a rival to OpenAI's dominant GPT model, which powers ChatGPT. Last week, Google rolled out a major update with the limited release of Gemini Pro 1.5, which allows users to handle large amounts of audio, text, and video input.
Gemini also created historically inaccurate images, such as one depicting the Apollo 11 crew as including a woman and a Black man.
Google admitted on Wednesday that its systems were not working properly.
“We are working to improve these kinds of depictions immediately,” Jack Krawczyk, Google's senior director of product management for Gemini Experiences, told WIRED in an emailed statement. “Gemini's AI image generation does generate a wide range of people, which is generally a good thing because people around the world use it. But it's missing the mark here.”
Krawczyk further addressed the situation in a post on X: “We design our image generation capabilities to reflect our global user base, and we take representation and bias seriously. We will continue to do this for open-ended prompts (images of a person walking a dog are universal!). Historical contexts have more nuance to them, and we will further tune to accommodate that.”
He also responded directly to some critics, sharing screenshots of his own interactions with Gemini that suggested the errors were not universal.
But the problems Gemini exhibited were quickly seized upon by online anti-woke campaigners, who variously claimed that Google was “racist” or “infected with the woke mind virus.”