Days after mass layoffs cut 12,000 jobs at Google, hundreds of former employees have gathered in an online chat room to commiserate.

They swapped theories about how management decided who to cut. Could a “mindless algorithm carefully designed not to violate any laws” have chosen who got cut, one member asked in a Discord post, The Washington Post reports.

Google says there was “no algorithm” involved in its cuts. But former employees have their suspicions, as artificial intelligence tools become increasingly entrenched in office life.

HR managers use machine learning software to analyze millions of pieces of employment-related data to make recommendations on who to interview, hire, promote, or retain.

But as layoffs sweep Silicon Valley, the software likely faces a grimmer task: helping decide who gets laid off, HR analysts and workforce experts say.

A January survey of 300 HR leaders at US companies found that 98% of them believe software and algorithms will help them make layoff decisions this year. And with some companies cutting staff by the tens of thousands, such decisions are difficult for humans to make alone.

According to Joseph Fuller, a Harvard Business School professor who co-directs the Managing the Future of Work initiative, large firms, from tech giants to consumer goods companies, often use software to find the “right person” for the “right project”.

These products create a “skills inventory” — a powerful employee database that helps managers identify which work experience, certifications, and skill sets are associated with high performance in various positions.

Human resource companies have also taken advantage of the artificial intelligence boom. Firms like Eightfold AI use algorithms to analyze billions of pieces of data pulled from online career profiles and other skills databases, helping recruiters find candidates whose applications might not otherwise appear.

Since the 2008 recession, HR departments have become “incredibly data driven,” says Brian Westfall, senior HR analyst at Capterra, a site that helps businesses choose software. Turning to algorithms can be especially handy for some HR managers when making difficult decisions like layoffs, he added.

Many rely on software that analyzes performance data: 70% of HR managers in a Capterra survey said performance was the most important factor in deciding whom to let go.

Other metrics used to select people for layoffs can be less clear-cut, Westfall says. For example, HR algorithms can calculate which factors make a person a “flight risk” — more likely to leave the company.

According to him, this raises numerous problems. If an organization has cultivated a discriminatory culture, for example, workers of color may leave the company at higher rates; an algorithm with no knowledge of that context could flag such workers as higher “flight risks” and suggest more of them for layoffs, he added.

Jeff Schwartz, vice president of AI-powered HR software company Gloat, says his company’s software works as a recommendation engine, similar to how Amazon recommends products, helping companies decide whom to interview for open positions.

He doesn’t think Gloat customers use the software to create layoff lists. But he acknowledges that HR leaders need to be transparent about how they make such decisions, including how widely algorithms are used.

The reliance on software has sparked a debate about what role algorithms should play in putting people out of work and how transparent employers should be about the reasons that lead to job losses, workforce experts said.

“The danger here is using bad data,” said Westfall, “[and] coming to a decision based on something an algorithm says and just following it blindly.”

But HR organizations are “overwhelmed since the pandemic” and will continue to use software to lighten their workloads, says Zach Bombatch, a labor and employment attorney and member of Disrupt HR, an organization that tracks advances in human resources.

With this in mind, companies should not allow algorithms alone to decide who gets cut, and they must review the software’s proposals to make sure they don’t discriminate against Black workers, women, or older employees — which could lead to lawsuits.