Amazon drops secret AI recruiting tool that showed bias against women

FILE PHOTO: The logo of Amazon is pictured inside the company's office in Bengaluru, India, April 20, 2018. REUTERS/Abhishek N. Chinnappa/File Photo

Bias in machine learning can be a problem even for tech giants like Amazon, which is at the forefront of this emerging field. A new report from Reuters shows the technology giant had to scrap an internal tool used to vet job applications after it displayed bias against women.

The team responsible for the tool had been building computer programs since 2014 to review job applicants’ resumes. Amazon’s hiring tool used artificial intelligence to give candidates scores from one to five, similar to how online shoppers rate products on the site.
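The report does not describe how the system worked internally. As a rough illustration only, a resume scorer of this kind is often framed as text regression: vectorize resume text and fit a model against historical one-to-five hiring outcomes, so the model simply learns whatever patterns, biased or not, were present in past decisions. The sketch below is hypothetical; the data and model choice are made up.

```python
# Hypothetical sketch -- not Amazon's system. Resume scoring framed as
# text regression over historical 1-5 decisions. All data is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Toy "historical" resumes and the scores past reviewers assigned them.
resumes = [
    "software engineer, java, distributed systems, aws",
    "captain of the women's chess club, python, data analysis",
    "devops, kubernetes, terraform, on-call experience",
    "marketing intern, excel, social media",
]
scores = [5, 3, 4, 2]  # 1-5, like product star ratings

model = make_pipeline(TfidfVectorizer(), Ridge())
model.fit(resumes, scores)

# Score a new applicant; the model echoes whatever patterns -- including
# biased ones -- appeared in the historical decisions it was trained on.
new_resume = ["women's college graduate, python, machine learning"]
print(model.predict(new_resume))
```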

“Everyone wanted this holy grail,” one individual familiar with the project told Reuters. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”


A source familiar with the company's thinking said the project "was only ever used in a trial phase, never independently, and never rolled out to a larger group," adding that the project was abandoned for several reasons, including its inability to return strong candidates for the roles.

A year later, the Seattle company realized the new system was not rating candidates in a gender-neutral way. The system penalized resumes that included the word “women’s,” and it downgraded graduates of two all-women’s colleges, according to the report.

While Amazon was able to edit the program to make it neutral to these particular terms, there was no way to be sure the system would not find other, still discriminatory ways of ranking candidates.
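One way to picture that editing step, purely as a hedged illustration and not Amazon's actual fix, is a preprocessing pass that strips the flagged terms before scoring. The term list and function below are hypothetical; the point is that scrubbing explicit words leaves correlated proxy signals untouched, which is why the edit offered no guarantee.

```python
# Hypothetical illustration of "make it neutral to these terms":
# strip flagged words from resume text before it reaches the scorer.
import re

FLAGGED_TERMS = [r"\bwomen's\b", r"\bwomen\b"]  # invented list for the example

def neutralize(resume_text: str) -> str:
    """Remove explicitly gendered terms from a resume before scoring."""
    cleaned = resume_text
    for pattern in FLAGGED_TERMS:
        cleaned = re.sub(pattern, "", cleaned, flags=re.IGNORECASE)
    return re.sub(r"\s+", " ", cleaned).strip()

print(neutralize("Captain of the women's chess club"))
# -> "Captain of the chess club"
# A model can still key on correlated features (college names, activities,
# word choices), so removing terms does not guarantee unbiased rankings.
```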

Amazon ultimately decided to disband the team last year because executives had lost hope for the system, according to Reuters.

An Amazon spokesman declined to comment to Fox News for this story.

Over the past few years, artificial intelligence has been applied in many contexts, from predicting when a patient will die to national defense. Even where it proves useful, prejudices about gender and race can still find their way into AI programs, forcing tech companies to rework facial-recognition software that may seem neutral on its face.


Despite this, startups working on AI recruitment tools still market their products as a way to avoid bias. Amazon is reportedly taking another run at the idea as well: Reuters said the company is giving automated recruitment another try, this time with a focus on diversity.
