Is AI Ready to Recruit Job Candidates? LinkedIn and Amazon Have Different Opinions

October 17, 2018 - 8 minutes read

These days, each job opening garners hundreds of applications. Sifting through resumes and cover letters often leads to recruiter burnout, which can directly affect the company’s incoming talent pool.

To help, artificial intelligence (AI) app developers at LinkedIn and Amazon are working on automating some of the day-to-day work of recruiters. So far, their results couldn’t be more different: one company built inherent bias into its application, while the other managed to avoid it.

LinkedIn AI Applicant Finder and Tracker

LinkedIn took its AI-enabled recruiter a step further by building in applicant tracking features and a talent finder. The talent finder helps companies surface candidates who are qualified and who also bring gender diversity and a range of experience. These features will be rolled out in the company’s Talent Insights tool made for recruiters.

The AI tool removes biases in the data before analysis, so that searches return candidates from a wider range of backgrounds. LinkedIn will also now track the hiring process; companies can see analytical insights on how well job posts and InMail are performing at garnering the attention of female applicants. Additionally, the company re-ranks search results in the LinkedIn Recruiter tool to show more women and non-binary candidates.

It may sound like LinkedIn is building in a bias against men in their results, but that’s not really the case. John Jersin is the VP of Product Management for the LinkedIn Talent Solutions product. He explains how search results will display: “Say I search for an accountant, and there are 100,000 accountants in the city I’m looking at. If the gender breakdown is 40-60, then what Representative Results will do is that no matter what happens in our AI, the top few pages will have that same 40-60 gender breakdown.”
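LinkedIn hasn’t published how Representative Results is implemented, but the idea Jersin describes is easy to sketch: take a relevance-ordered result list, and re-interleave it so that every prefix of the results roughly matches the target breakdown of the underlying candidate pool. A minimal illustration in Python (all names and the greedy strategy here are our own, not LinkedIn’s):

```python
from collections import deque

def representative_rerank(ranked, target):
    """Re-rank candidates so every prefix of the results roughly matches
    a target group breakdown (e.g. {"F": 0.4, "M": 0.6}).

    `ranked` is a relevance-ordered list of (name, group) tuples;
    within each group, the original relevance order is preserved.
    """
    # Split the relevance-ordered list into one queue per group.
    queues = {g: deque() for g in target}
    for cand in ranked:
        queues[cand[1]].append(cand)

    result, counts = [], {g: 0 for g in target}
    while any(queues.values()):
        slot = len(result) + 1
        # Pick the non-empty group currently furthest below its target share.
        g = max((g for g in target if queues[g]),
                key=lambda g: target[g] - counts[g] / slot)
        result.append(queues[g].popleft())
        counts[g] += 1
    return result
```

With a 60/40 pool, the top of the re-ranked list stays close to 60/40 regardless of how skewed the original relevance ordering was, which is exactly the behavior Jersin describes for the accountant example.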

Removing Bias Against Race and Gender

Many companies in the tech space are having difficulty finding, recruiting, and retaining female tech employees. So far, progress has been too slow to make a real impact on women in tech. However, studies report that diversified teams bring higher profits, more concentrated work, and higher levels of innovation.

But when you introduce another human into the hiring process (i.e. the recruiter), that person adds an additional layer of inherent, unconscious bias. For example, recruiters can give preference to candidates who remind them of themselves; if the recruiter believes certain stereotypes, that can also skew the candidate pool.

Because an AI is a machine rather than a person, it ideally introduces no bias of its own, provided it’s trained correctly.

Jersin elaborates, “Unconscious bias occurs in a split second. When you look at someone for the first time, you don’t even realize you’re forming an opinion in a certain way. As [hiring managers] are shifting to use artificial intelligence to help hiring decisions, we lay out those decisions in the data and the code.”

Tracking Female Candidates Throughout the Entire Process

From finding applicants to comparing with industry diversity data, LinkedIn is empowering companies with gender breakdowns in every step of the recruitment process.

For now, LinkedIn has no plans to add more diversity features into its current AI tool; gender will remain the only diversity element tracked. To help with data collection, the company will acquire Glint, a tech startup and platform that measures employee engagement through surveys. LinkedIn hopes that Glint’s insights can help companies retain talent better.

“We’re developing a new level of artificial intelligence that’s improving efficiency in our product. We’re ensuring that it’s working in a fair representative way. We want to make sure that we’re taking steps to help our customers with diversity,” adds Jersin.

Amazon’s Fight Against an Inherently Sexist AI

Amazon had the same types of aspirations as LinkedIn. But the Seattle-headquartered tech giant had one major flaw in its implementation: it trained its AI on mostly male resumes. As a result, their candidate finder prioritized mostly men and pushed female resumes to the bottom of the pile.

Although the company stopped work on the project in early 2017, that didn’t stop the media from reporting on the sexist AI a few weeks ago. In 2014, Amazon tasked an engineering team in Edinburgh, Scotland with automating recruitment as much as possible. The engineers developed 500 models to “read” past candidates’ resumes in search of a match for any of 50,000 specific terms.

An unnamed person close to the project detailed the long-term plan for the recruiter AI: “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

You’d expect the AI to remain at least somewhat unbiased, but it got caught in a feedback loop and became increasingly biased against women’s resumes. For example, it automatically removed resumes from consideration if the education listed was one of two women-only colleges. And if your resume contained the word “women’s”, you could expect to be cut from consideration.
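Amazon hasn’t published its models, but the failure mode is easy to reproduce in miniature: train any term-weighting scorer on a hiring history dominated by men, and terms that appear mainly on women’s resumes end up penalized, even though they say nothing about ability. A toy sketch (the data and the scoring scheme are invented for illustration):

```python
import math
from collections import Counter

# Toy history: past hires skew heavily male, so terms that correlate
# with women's resumes barely appear among the positive examples.
hired = [
    "software engineer java chess club captain",
    "software engineer python rugby team",
    "java developer chess club",
    "python engineer robotics team",
]
rejected = [
    "software engineer python women's chess club captain",
    "java developer women's coding society",
]

def term_weights(positives, negatives, smoothing=1.0):
    """Smoothed log-odds weight per term: positive if a term is more
    common in hired resumes, negative if more common in rejected ones."""
    pos, neg = Counter(), Counter()
    for resume in positives:
        pos.update(resume.split())
    for resume in negatives:
        neg.update(resume.split())
    vocab = set(pos) | set(neg)
    return {t: math.log((pos[t] + smoothing) / (neg[t] + smoothing))
            for t in vocab}

weights = term_weights(hired, rejected)
# The model has learned nothing about skill -- only that "women's"
# never co-occurred with a past hire, so the term gets a penalty.
print(weights["women's"] < 0)  # True
```

The same dynamic applies to any learned scorer, not just this toy one: if the training labels encode a historical imbalance, the model faithfully reproduces it.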

A Failed Test

Amazon’s engineering team attempted to fix the AI, but after the first fiasco, the team began questioning whether the AI would simply find new ways to discriminate against candidates.

However, the AI developed by Amazon wasn’t just sexist. It also couldn’t identify candidates that were actually suited for the job; it routinely recommended applicants who weren’t qualified. When an algorithm develops a bias from improper training or from its human developers, relying on the AI obviously isn’t prudent.

Some sources say that Amazon recruiters “never used the AI tool to evaluate candidates”, while others claim Amazon’s recruiters used the AI’s results to inform the hiring process.

It’s interesting to see how Amazon’s AI failed while LinkedIn’s AI is flourishing and ready for public use. Most AI developers know how easy it is to create a biased algorithm; we hope this tale of two tech companies further underscores the importance of training AI on a diverse set of data.
