As part of recent layoffs affecting 10,000 employees, Microsoft also laid off the entire ethics and society team within its artificial intelligence organization.

According to current and former employees, the move leaves Microsoft without a dedicated team to ensure that responsible AI principles are closely tied to product design as the company tries to make artificial intelligence tools available to more users.

Microsoft still has an active Office of Responsible AI, whose task is to create rules and principles governing the company’s initiatives in the field of artificial intelligence. The company says its overall investment in responsible AI is growing, despite the recent layoffs.

Employees noted that the ethics and society team has played a critical role in ensuring that the company’s principles of responsible AI are reflected in the design of the products it makes.

In recent years, the team developed a role-playing game called Judgment Call, which helped designers imagine the potential harm AI could cause and discuss it during product development. It was part of a larger “responsible innovation toolkit” that the team made publicly available.

More recently, the team had been working to identify the risks of embedding OpenAI’s technology across Microsoft’s product line.

The ethics and society team was at its largest in 2020, with around 30 employees, including engineers, designers and philosophers. In October, the team was cut to seven people as part of a reorganization.

“It’s not that it’s going away — it’s that it’s evolving,” said John Montgomery, corporate vice president of artificial intelligence at Microsoft. “It’s evolving toward putting more of the energy within the individual product teams that are building the services and the software, which does mean that the central hub that has been doing some of the work is devolving its abilities and responsibilities.”

Some employees say the move creates a fundamental gap in the user experience and holistic design of AI products.

“The worst thing is we’ve exposed the business to risk and human beings to risk in doing this,” one of them explained.

Members of the ethics and society team said they generally tried to support product development. But, they say, as Microsoft focused on delivering AI tools faster than its competitors, the company’s leadership became less interested in the long-term thinking that the team specialized in.

On the one hand, Microsoft may now have a once-in-a-generation chance to gain a significant lead over Google in search, productivity software, cloud computing and other areas where the two giants compete. After relaunching Bing with artificial intelligence, the company told investors that every percentage point of search market share it could take from Google would bring it $2 billion in annual revenue.

That explains why Microsoft invested in OpenAI and is now trying to integrate the startup’s technology into every corner of its empire.

On the other hand, everyone involved in AI development agrees that the technology carries powerful and possibly existential risks, both known and unknown.

The elimination of the ethics and society team comes just as its remaining members were focusing on perhaps their biggest challenge: predicting what will happen when Microsoft releases tools based on OpenAI’s technology to a global audience.

Last year, the team wrote a memo detailing the brand risks associated with Bing Image Creator, which uses OpenAI’s DALL-E system to create images from text prompts. Image Creator launched in several countries in October, becoming one of Microsoft’s first public collaborations with OpenAI.

While text-to-image technology has proven wildly popular, Microsoft researchers correctly predicted that it could also threaten artists’ livelihoods by allowing anyone to easily copy their style.

Microsoft researchers compiled a list of mitigation strategies, including blocking Bing Image Creator users from entering the names of living artists as prompts, and creating a marketplace to sell artists’ work that would surface when someone searched for an artist’s name.

According to employees, none of these strategies were implemented, and Bing Image Creator launched in the test countries anyway.