How AI Algorithms Are Taking Over Employee Management

October 5, 2020 - 8 minutes read

Artificial intelligence (AI) has become a major part of our daily lives. We interact with algorithms on social media, we get emails and ads based on our purchases and browsing behavior, and we're even matched with an Uber or Lyft driver through this smart technology.

Still, whether algorithms are good for employees is another question. When AI manages employees and their compensation, sentiment can quickly turn negative. At a growing number of companies, self-learning AI algorithms help hire employees, measure their productivity, set tasks and goals, and, perhaps worst of all, even terminate them.

AI Is Already at Work

When AI algorithms are given the responsibility to make and execute decisions that affect employees, it’s called “algorithmic management.” Many employees who’ve experienced algorithmic management attest that it’s impersonal and can entrench a company’s pre-existing biases. It also deepens the power imbalance between management and employees.

More than 700 companies have tested AI algorithms that score an applicant’s likelihood of success from a job interview. HireVue, an AI development company, uses algorithms to score applicants on their language, facial expressions, and tone; the company claims its technology speeds up the hiring process by 90%.


For employees who want to challenge algorithmic management, it’s next to impossible: the algorithm’s code is a secret, and the decision-making process and analysis are hidden. Unsurprisingly, it’s frustrating to watch input go in and output come out with no explanation of how the algorithm reached its conclusions. It’s also difficult to reach anyone who could grant you access to scrutinize or understand the algorithm, and even if you found someone willing to help, they would likely be legally bound not to.

AI’s Impact on the Gig Economy

Gig economy companies already treat their workers as contractors rather than employees, so it’s no wonder that new technology is tested on gig workers before full-time employees. Companies like Uber, Lyft, and Deliveroo use machine learning algorithms to allocate, evaluate, monitor, and reward their contractors’ work.

But this can spiral out of control quickly. Over the past year, as the pandemic reached full strength, Uber Eats workers complained of unexplained changes to the algorithm that resulted in lower incomes and fewer jobs to complete. However, contractors can’t be sure the algorithm caused the problem, because no company that uses AI to manage and evaluate its workers is transparent about its code. No worker can even guess how much an algorithm controls their well-being and income.


In interviews with 58 food-delivery workers, most knew that their gigs were allocated by an algorithm and that the app collected data for later use. But they didn’t know exactly how that data was used to give them more or fewer gigs. To game the algorithm, many workers devised strategies to get more jobs, such as accepting them as fast as possible and waiting in “magic locations.” A problem arose, however: no one works gigs in order to be forced to stay in one spot hunting for jobs. The flexibility benefit of the gig economy goes out the window when you have to wait all day at a “magic location.”

Ingrained Biases and Issues

Research has shown that AI algorithms are often deeply biased, in part because they’re largely developed and tested by white men, which can introduce bias against women and people of color into the algorithms they code. An unforgettable example is the COMPAS software used by U.S. judges, parole officers, and probation officers to rate a defendant’s risk of re-offending. A 2016 investigation by New York City-based ProPublica concluded that COMPAS was incredibly discriminatory: it had incorrectly classified black defendants as higher risk 45% of the time, compared with just 23% for white defendants.
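Those 45% and 23% figures are false positive rates: the share of people who did not re-offend but were still flagged as high risk, computed separately for each group. A minimal sketch of how such a disparity is measured, using made-up numbers (this is not the COMPAS dataset):

```python
def false_positive_rate(predicted_high_risk, reoffended):
    # FPR = people flagged high-risk who did NOT re-offend,
    # divided by all people who did not re-offend.
    false_positives = sum(p and not r
                          for p, r in zip(predicted_high_risk, reoffended))
    negatives = sum(not r for r in reoffended)
    return false_positives / negatives

# Hypothetical group A: 10 people who never re-offended, 4 flagged high-risk.
group_a_pred = [True] * 4 + [False] * 6
group_a_true = [False] * 10

# Hypothetical group B: 10 people who never re-offended, 2 flagged high-risk.
group_b_pred = [True] * 2 + [False] * 8
group_b_true = [False] * 10

print(false_positive_rate(group_a_pred, group_a_true))  # 0.4
print(false_positive_rate(group_b_pred, group_b_true))  # 0.2
```

When the two rates diverge this sharply, the model is making costlier mistakes for one group than the other, even if its overall accuracy looks fine.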

Because algorithmic management gives executives insight into the algorithm’s inner workings while hiding every detail from the employees it affects, it has two major effects. It entrenches systemic biases, perpetuating the same type of discrimination the COMPAS investigation exposed years ago. And it widens the power imbalance between workers and management. For example, when Uber Eats couriers asked corporate why their gig counts were lower than normal, Uber told them that they “have no manual control over how many deliveries you receive.”


In the Australian state of Victoria, where an AI algorithm times Amazon workers in Melbourne as they scan items, a government report observed, “The absence of concrete evidence about how the algorithms operate makes it hard for a driver or rider to complain if they feel disadvantaged by one.” The report also noted that it’s difficult to confirm whether there is real “concern over algorithm transparency.”

But that’s the point, isn’t it? There is no list of employees concerned about algorithmic transparency, and because workers around the world aren’t publicly documenting this lack of transparency, it’s difficult to even begin building a solution.

Looking into Algorithms

Until the transparency, impact, and effects of algorithmic management are closely studied, it is imperative that we treat this technology with healthy skepticism. As with any new application of technology, we must make tweaks to improve it and be prepared to shut it down if it proves destructive to people’s well-being. Without human oversight, we won’t be able to safely automate traditionally human-centered tasks such as employee management. If we can’t provide safety and support nets for each other, no machine ever will.

What do you think of using AI algorithms for employee management? Have you dealt with this personally? As always, let us know your thoughts in the comments below!
