Google has apologized for what it describes as “inaccuracies in some historical images” created by its Gemini artificial intelligence tool. According to the tech giant, its attempts to generate a wide range of results missed the mark.

The statement came after criticism that the tool depicted historically white groups, such as German soldiers during the Nazi era, as people of color, in what appears to be an over-correction of racial bias issues in AI image generation.

“We’re aware that Gemini is offering inaccuracies in some historical image generation depictions. We’re working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here,” reads a statement posted to the official Google Communications account on X.

Image: Gemini results for “a German soldier from 1943,” featuring illustrations of what appear to be a white man, a Black man, and an Asian woman.

It’s not the first time users have pointed out that Gemini’s images are “somewhat inaccurate.” For example, here are some results of user queries that contradict the historical record.

At the moment, Gemini seems to be refusing some image generation tasks outright: it declined to generate images of Vikings, Nazi soldiers, or even U.S. presidents of the 1800s for one of The Verge’s reporters.

Earlier, it was reported that Bing users had generated images of the 9/11 attacks in which the planes were piloted by cartoon characters.