Microsoft engineer Shane Jones has discovered that Copilot Designer, an AI tool based on OpenAI technology, generates violent images and ignores copyright, CNBC reports.

Shane Jones has worked at Microsoft for six years and is now a general manager of software development at the corporation’s headquarters in Redmond.

According to him, he was not involved in the development of Copilot Designer, originally called Bing Image Creator. But he joined the company’s “red team” and tested the AI tool for potential vulnerabilities.

He found that it was generating images that ran counter to Microsoft’s responsible AI principles. For example, Copilot Designer depicted demons and monsters alongside terminology related to abortion rights, teenagers with rifles, sexualized images of women in violent scenes, and underage drinking and drug use.

The engineer said that he had reported the issue internally in December. Although Microsoft acknowledged his concerns, it declined to withdraw the product from the market.

Given the situation, the engineer published an open letter on LinkedIn, but the company’s legal department asked him to delete the post, and he complied. In January, however, he wrote to U.S. senators and later met with staff of the Senate Committee on Commerce, Science, and Transportation.

He also recently sent a letter to Lina Khan, chair of the U.S. Federal Trade Commission, and another to Microsoft’s board of directors. In the letter to Khan, he noted that he had repeatedly urged the company to remove the tool from public use until better safeguards were in place.

Since Microsoft has “refused this recommendation,” he urged the tech giant to add disclosures to the product and change the rating of the Android app to clearly state that it is intended for mature audiences only.

In the letter to Microsoft’s board of directors, the engineer asked that the board’s Environmental, Social and Public Policy Committee investigate certain decisions by the legal department and management, and launch “an independent review of Microsoft’s responsible reporting processes for artificial intelligence incidents.”

This is not the first controversy surrounding the company’s AI tool. Last year, for example, users were able to generate images recreating the September 11, 2001 attacks in the United States.

Numerous AI-generated images show the cockpit of a passenger plane flying toward the towers of the World Trade Center in New York — except the pilot is Mickey Mouse, SpongeBob, or the video game character Kirby.