
The ‘Amazing’ AI recruitment tool by Amazon is very biased against hiring women

by Milicent Atieno

Between 2014 and 2017, e-commerce giant Amazon tried using an Artificial Intelligence (AI) tool to rate job candidates in much the same way it rates products in its online catalog. The tool gave applicants a score of between one and five stars, as reported by Reuters.

One year into the deployment of the AI tool, the programmers came to the realization that their ‘holy grail’ recruitment tool was heavily biased against women. The system was not identifying the potential of all job candidates equally: female applicants consistently got lower ratings, while their male counterparts were assessed fairly.

The problem of Garbage In, Garbage Out (GIGO)

An in-depth study of why Amazon’s recruitment AI was consistently favoring male candidates while dismissing female ones reveals that the problem was not with the algorithm in and of itself, but with the data fed into the algorithm to teach it what a desirable candidate looks like.

An AI system learns from the data it is given. To get an AI to distinguish between a rat and an elephant, it first has to learn from gigabytes of image data of the two animals before it can tell them apart accurately.

Such is the case with Amazon’s recruitment AI in weighing the qualifications of male and female job candidates. For the AI to work appropriately, Amazon would have needed to feed it historical data on the organization’s desirable job candidates that was devoid of any prejudice.

However, Amazon has historically hired more men than women, so feeding the AI that data automatically leads to more men than women being recommended. Reuters reports that the team behind this innovative recruitment machine trained the AI on resumes submitted to Amazon over the previous 10 years, a record of resumes in which mostly male candidates got the job.

So there is little wonder the AI kept churning out more and more male candidates while dismissing female ones. The hiring algorithm simply echoes the biases it learned from the historical hiring data in Amazon’s archives.
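To make the mechanism concrete, here is a minimal, hypothetical sketch in Python (using scikit-learn) of how a resume classifier trained on skewed historical hiring outcomes ends up reproducing them. The resumes, labels, and model choice are invented for illustration only; this is not Amazon’s data or system.

# Hypothetical sketch: a text classifier trained on biased historical hiring data.
# All resumes and labels below are invented; this is not Amazon's data or model.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy historical records: past hiring decisions that happen to skew male.
resumes = [
    "captain of the chess club, software engineering internship",   # hired
    "led the robotics team, built distributed systems",             # hired
    "women's chess club captain, software engineering internship",  # rejected
    "women's coding society lead, built distributed systems",       # rejected
]
hired = [1, 1, 0, 0]  # the biased historical outcomes the model learns from

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Two new resumes that differ only by the word "women's": the model scores
# the second one lower, simply echoing the pattern in its training data.
new_resumes = [
    "chess club captain, software engineering internship",
    "women's chess club captain, software engineering internship",
]
scores = model.predict_proba(vectorizer.transform(new_resumes))[:, 1]
for text, score in zip(new_resumes, scores):
    print(f"{score:.2f}  {text}")

Nothing in the code is broken in a narrow sense; the model faithfully learns from the data it is given, which is exactly the GIGO problem.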

That hiring at Amazon skews male is evident in the company’s 2014 diversity report, which showed that 63% of its employees were male. The imbalance was even starker when sieved down to just employees holding managerial positions, 75% of whom were male.

The AI penalized the word ‘women’ in its review

It is further reported that the AI-powered recruitment tool penalized any resume containing the word ‘women’s’ that it came across, for instance in the context of a women’s club or sports team, and downgraded women’s colleges as less preferable.
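Continuing the hypothetical sketch above (it reuses the vectorizer and model trained there), one way such a word-level penalty shows up is in the weights the toy model has learned: the token derived from "women's" ends up with the most negative coefficient. Again, this is an illustration on invented data, not a description of Amazon’s actual tool.

# Continuation of the earlier hypothetical sketch: inspect which terms the
# toy model penalizes. A strongly negative weight on the "women" token mirrors
# the word-level penalty described in the reporting.
import numpy as np

terms = vectorizer.get_feature_names_out()
coefs = model.coef_[0]
for idx in np.argsort(coefs)[:3]:  # the three most penalized terms
    print(f"{terms[idx]:12s} {coefs[idx]:+.3f}")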

When Amazon discovered these biases in its AI recruitment bot, once praised as the ‘holy grail’ and next frontier in HR, it tried to make the algorithm more neutral. However, there is no guarantee that the changes will make hiring more gender neutral.

What does this mean for the increasing use of AI in government decisions and other socio-economic activities?

Today’s world seems fixated on using big data and AI-driven analytics to make autonomous decisions. That means the future might very well be run by AI, and if these systems mirror the same historical biases humans have, what are the chances of a more neutral judgment?

Since AI needs historical data to chart out the future, yet that historical data mirrors our human biases, won’t the new tech world simply digitize prejudice? This vice applies not just to gender, but also to race, religion, sexual orientation, and just about any other aspect of human diversity.

To make things worse, some government functions, though still experimental, are being delegated to AI technology. Could this lead to violations of civil rights laws?
