
What can Amazon’s AI recruiting tool teach us about reliance on automation?

Lessons from Amazon's failed attempt at an AI recruiting tool

Amazon, the technology giant that took its first steps back in 1995 as an online bookstore, is now a global hub for anything and everything addressing human needs. Amazon seems to have what can only be described as the Midas touch: the organization has seen immense success in its technology ventures. But that doesn't mean it has never seen failure. Let's look at how and why Amazon failed to deliver an AI-backed recruitment tool that had the potential to change HR functions around the world.

What was Amazon’s AI recruiting tool?

Alongside broader digital transformation efforts, the Covid-19 pandemic significantly accelerated the shift to digital recruitment. This has, in turn, led to an obvious reliance on automation to support the mammoth task of hiring effectively and efficiently. But let's back up a bit.

Amazon began developing its AI recruiting tool in 2014. Its fundamental task was to rank job candidates from one to five stars in order to quickly identify top talent and set aside the rest. To achieve this, the team created approximately 500 computer models, focused on specific job functions and locations, that crawled through past candidates' resumes and learned to recognize some 50,000 key attributes or terms that appeared in them. Engineers trained the system using the resumes of top-performing employees as models for what they were looking for in new hires. The resulting machine-learning models then scored incoming resumes against patterns learned from resumes submitted over the previous ten years.
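
To make the mechanism concrete, here is a minimal, hypothetical sketch in Python of that kind of pipeline: a model learns which resume terms correlated with past top performers and then scores new resumes on a one-to-five-star scale. The data, feature cap, and model choice are illustrative assumptions; Amazon's actual system was never made public.

    # Hypothetical sketch only: learn term patterns from past hiring outcomes
    # and map them to a 1-5 star score. All data below is invented.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Historical resumes and whether the candidate became a top performer (1) or not (0).
    past_resumes = [
        "software engineer led platform team shipped distributed systems",
        "software engineer built data pipelines optimized latency",
        "junior analyst supported reporting and documentation",
        "intern assisted with manual testing and qa",
    ]
    past_outcomes = [1, 1, 0, 0]

    # The article mentions ~50,000 key terms; the feature cap here mirrors that idea.
    vectorizer = TfidfVectorizer(max_features=50_000)
    X = vectorizer.fit_transform(past_resumes)
    model = LogisticRegression().fit(X, past_outcomes)

    def star_rating(resume_text: str) -> int:
        """Map the model's 'top performer' probability onto a 1-5 star scale."""
        prob = model.predict_proba(vectorizer.transform([resume_text]))[0, 1]
        return int(round(1 + 4 * prob))

    print(star_rating("software engineer built distributed data pipelines"))

Scoring resumes this way works only as well as the historical labels behind it, which is exactly where the problems below begin.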

While on paper and in theory all this sounded great, the effort failed in practice, and that failure makes for an interesting case study.

Here are some key lessons highlighting the pitfalls of Amazon's AI recruitment tool:

Lacked reliability in screening candidates

While an AI-enabled tool can be reliable at reading and screening resumes, differences in resume formats, wording, and fonts make it difficult for the system to understand them.

Additionally, such a system has a high probability of rejecting candidates simply because they don't meet every criterion in the job description, even if their experience or skill set more than compensates for a missing qualification. As a result, unoptimized resumes get ignored even when they come from great candidates. In Amazon's case, the tool was trained on resumes submitted over the course of ten years, which came predominantly from men. The system therefore learned to favor resumes containing the words and phrases more commonly used by male applicants.

Inability to make future decisions from past data

The output of a model is only as good as its input, and here the problem lay in the data itself: it reflected the past rather than the present. Amazon used ten years of past hiring data to predict future outcomes, selecting candidates based on what high-performing resumes had looked like in a period when the pool of women candidates was much smaller. The tool could not account for macro factors reshaping the recruitment industry, such as policy changes, hiring trends, and the growing emphasis on diversity and inclusion.

Owing to the smaller pool of women applicants in the historical data, the algorithm could not reliably distinguish between strong and weak women candidates. As a result, it ended up tagging more male candidates as acceptable.
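
A hypothetical toy example, again with invented data, shows how this failure mode arises: when almost all positive labels come from male candidates, terms that mostly appear on women's resumes pick up negative weights even though they say nothing about ability.

    # Hypothetical illustration of skewed training data turning an incidental
    # term ("womens") into a negative signal. All resumes and labels are invented.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    resumes = [
        "captain mens rugby team software engineer",   # historically hired
        "software engineer mens chess club",           # historically hired
        "software engineer womens chess club",         # historically not hired
        "data analyst womens coding society",          # historically not hired
    ]
    labels = [1, 1, 0, 0]

    vec = CountVectorizer()
    clf = LogisticRegression().fit(vec.fit_transform(resumes), labels)

    # The weight on "womens" comes out negative purely because of the skewed labels.
    weights = dict(zip(vec.get_feature_names_out(), clf.coef_[0]))
    print(sorted(weights.items(), key=lambda kv: kv[1])[:3])

Nothing in this sketch targets gender explicitly; the skew in the historical labels is enough to reproduce the kind of biased behavior the tool showed.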

Impersonal nature of AI

AI chatbots can answer most questions, and do so faster than human recruiters. However, automation often brings the frustration of standardized responses, leading to a poor candidate experience. Close to 80% of job seekers say they prefer human interaction to an exchange with a bot, partly because machines struggle to process human emotions and slang.

While there is no doubt that automation and AI have transformed many aspects of the recruitment process, human interaction remains imperative throughout hiring, especially when shortlisting candidates based on their curriculum vitae. There are several arguments for human intervention over bots in the recruitment process:

    • Personalization: A human recruiter can cut through the biases and offer a more personalized, empathetic experience by tailoring the approach to each candidate, something extremely critical in a professional setting. A prospective candidate's window into an organization's culture starts with the recruitment process, which makes it critical for HR representatives to keep that experience seamless.
    • Engagement: Human interactions are more engaging than a chatbot exchange. For example, among the top issues cited by prospective applicants using company portals, 25% described filling in the initial application form as a tedious exercise. Many employers now use applicant tracking systems to filter the influx of applications before a human ever sees them. This not only lengthens the process for the candidate but also risks strong resumes going unnoticed, another possible peril of AI.
    • Trust: Candidates feel more comfortable knowing that a human is reviewing their application and making decisions based on their qualifications and experience.

Inability to measure intangibles in most cases

Artificial intelligence is great at processing measurable data like years of experience and education level. However, it struggles with the qualitative attributes of an applicant, such as personality, soft skills, and potential cultural fit within the organization. Unless data from things like cognitive aptitude and personality tests is available to inform the AI system, recruiters risk missing great talent.

Lessons from Amazon’s case

Machine learning and human intelligence are a powerful combination. Machines are trained by humans, and the Amazon case study is a clear illustration of the flaw many tech experts point to: rather than removing human biases from important decisions, artificial intelligence often simply automates them. The idea is to use the data with an additional lens of manual intervention to identify biases and act on changes that drive diversity and inclusivity.
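
As one concrete example of that manual lens, a team can routinely compare a tool's selection rates across groups, for instance with the four-fifths rule used in adverse-impact analysis. The numbers in the sketch below are invented purely to show the calculation.

    # Minimal sketch of a manual bias check layered on top of an automated screen:
    # compare selection rates across two groups (the "four-fifths rule").
    # Group sizes and counts are illustrative assumptions.
    def selection_rate(selected: int, applicants: int) -> float:
        return selected / applicants

    def adverse_impact_ratio(group_a: tuple[int, int], group_b: tuple[int, int]) -> float:
        """Ratio of the lower selection rate to the higher; below 0.8 is a warning sign."""
        rate_a, rate_b = selection_rate(*group_a), selection_rate(*group_b)
        return min(rate_a, rate_b) / max(rate_a, rate_b)

    # e.g. the tool shortlists 45 of 300 women and 120 of 500 men
    print(f"{adverse_impact_ratio((45, 300), (120, 500)):.2f}")  # 0.62, below 0.8

A ratio like this does not prove discrimination on its own, but it flags exactly the kind of skew a purely automated pipeline would quietly reproduce.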

Over to you

AI recruitment tools offer real advantages, including screening large volumes of resumes and job applications quickly and saving recruiters time and resources. With continuous learning and feedback, AI, its developers, and the recruitment industry can get better at understanding the issues and the needs of the entire recruitment process. It is important to test, iterate, and proof-check every initiative so that it acts as a safety net and avoids the risk of being unjust or discriminatory.

Considering the substantial positive impact AI can have on our world when used fairly and well, it is equally important that companies are supported in developing it.

Find more compatible candidates with Talent Intelligence.

Discover how Arya goes beyond conventional AI recruiting