Google Denies Use of AI Algorithms for Laying Off Employees

Nishita Gupta February 25, 2023
Updated 2023/02/25 at 8:36 AM

According to WaPo, Google has denied employing AI algorithms in its layoff decisions, claiming that “no algorithm is involved” in choosing which employees to let go.

Whether or not that is true, workers have plenty of reasons to be suspicious. The newspaper cited a recent survey in which 98% of human resources leaders at American businesses said they will use software and algorithms to “reduce labour costs” this year. Yet only half of those leaders were confident the technology would produce objective recommendations.

This is the darker side of a long-standing practice. Joseph Fuller, a professor of management practice at Harvard Business School, told WaPo that algorithms in large companies’ HR departments are routinely used to find the “right person” for “the right project.”

The technology contributes to the creation of a database that is referred to as a “skills inventory.” This database compiles a comprehensive list of each employee’s skills and experiences and aids businesses in determining whether these will be sufficient to accomplish their objectives.
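The matching described above can be pictured as a simple overlap score between an employee’s inventoried skills and a project’s requirements. This is only a minimal sketch of the idea; the names, skills, and scoring rule are hypothetical, and real HR systems are far more elaborate.

```python
# Hypothetical sketch of a "skills inventory" match.
# Score = fraction of required skills an employee already has.

def match_score(employee_skills, required_skills):
    """Return the fraction of required skills covered by the employee."""
    if not required_skills:
        return 0.0
    return len(employee_skills & required_skills) / len(required_skills)

# Toy inventory: employee -> set of skills (all made up for illustration).
inventory = {
    "alice": {"python", "sql", "ml"},
    "bob":   {"java", "sql"},
    "carol": {"python", "design"},
}

project_needs = {"python", "sql"}

# Rank employees by how well they cover the project's needs.
ranked = sorted(
    inventory,
    key=lambda e: match_score(inventory[e], project_needs),
    reverse=True,
)
print(ranked)  # alice covers both required skills, so she ranks first
```

The same inventory that helps staff projects can, of course, be queried in reverse to find who covers the fewest needed skills, which is why employees worry about how such databases get used during layoffs.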


Fuller stated, “Suddenly, they are just being used differently because that’s where people have a real inventory of skills.”

Take, for instance, the company Gloat: an “Artificial Intelligence Talent Marketplace” that uses AI to match employees with the projects most relevant to them, and projects with the employees best suited to staff them. While acknowledging the need for transparency from HR leaders, Gloat vice president Jeff Schwartz told WaPo that he is unaware of any clients using the platform to lay off employees.

Employee performance is the most obvious input to these tools, but they also weigh murkier metrics, such as “flight risk,” a prediction of how likely someone is to leave the company.

According to Brian Westfall, an analyst at the software review site Capterra, such AI software could inadvertently flag non-white workers as “flight risks” and recommend them for layoffs at higher rates if, for instance, a company has a discrimination problem that already causes non-white workers to quit at higher rates on average.
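Westfall’s point can be illustrated with a deliberately naive sketch: a “flight risk” model trained on historical attrition simply learns whatever pattern exists in the data, so if discrimination drives one group to quit more often, the model scores that group as higher risk and launders the bias into an apparently neutral number. The groups and attrition rates below are entirely made up for the illustration.

```python
# Hypothetical illustration of bias laundering in a "flight risk" model.
# The "model" here is just the observed historical quit rate per group.
from collections import defaultdict

# Historical records: (group, quit_within_year). Rates are invented:
# group A quits at 30%, group B at 10%.
history = ([("A", True)] * 30 + [("A", False)] * 70
           + [("B", True)] * 10 + [("B", False)] * 90)

def train_flight_risk(records):
    """Return each group's observed quit rate as its 'predicted' risk."""
    quit, total = defaultdict(int), defaultdict(int)
    for group, left in records:
        total[group] += 1
        quit[group] += left  # True counts as 1
    return {g: quit[g] / total[g] for g in total}

model = train_flight_risk(history)
print(model)  # group A is scored at 3x the flight risk of group B
```

If group A’s higher attrition was itself caused by discrimination, a layoff tool ranking by this score would recommend firing group A members disproportionately, reinforcing the very problem that generated the data.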

