Yesterday, we reported that Meta's AI image generator was making everyone Asian, even when the text prompt specified a different race. Today, I briefly ran into the opposite problem: the tool refused to generate any Asian people at all using the same prompts as the day before.
I ran yesterday's test on Instagram, via the AI image generator available in direct messages. After dozens of tries, I couldn't generate a single accurate image using prompts like “Asian man and white friend” or “Asian man and white wife.” Only once did the system successfully produce a photo of an Asian woman and a white man; otherwise, it kept making everyone Asian.
After I first reached out for comment yesterday, Meta's spokesperson asked for more details about my story, including my deadline. I replied, but heard nothing back. Today, I was curious whether the problem had been resolved or whether the system still couldn't create accurate images of Asian and white people together. Instead of a batch of racially inaccurate photos, I got an error message: “Please try again later or try a different prompt.”
Strange. Had I hit some limit on generating images of Asian people? A Verge colleague also tried it and got the same results.
We also tried more general prompts involving Asian people, such as “Asian man in a suit,” “Asian woman shopping,” and “Asian woman smiling.” Instead of an image, I got the same error message each time. Once again, I reached out to Meta's communications team to ask what was going on and why I couldn't generate images of Asian people. (During this window, I was also unable to generate images using prompts such as “Latino man in a suit” or “African American man in a suit.” I asked Meta about this as well.)
Forty minutes after I reached out, I still hadn't heard from Meta. But by then, Instagram's feature was working again for simple prompts like “Asian man.” Quietly changing something, fixing a bug, or pulling a feature after a reporter asks questions is pretty standard at many companies I cover. Did I personally cause the temporary shortage of AI-generated Asian people? Was it just a coincidence of timing? Is Meta working on fixing the problem? I wanted to know, but Meta didn't answer my questions or offer any explanation.
Regardless of what's going on at Meta HQ, there's still work to be done. Prompts like “Asian man and white woman” now return images, but the system still gets the races wrong, making both people Asian, just like yesterday. So it seems we're back to square one. I'll keep an eye on things.
Screenshot by Mia Sato / The Verge