Amazon scraps secret AI recruiting tool that showed bias against women | Reuters


39 bookmarks. First posted by brchastain 9 weeks ago.


SAN FRANCISCO (Reuters) - Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women. The…
from instapaper
9 weeks ago by yudha87
Amazon had to rework an old recruiting tool because it had favored men
from twitter
9 weeks ago by grzbielok
Reuters' sources detail how Amazon shut down a machine learning tool for rating job applications in 2017 because it was biased against female candidates
9 weeks ago by joeo10
OCTOBER 9, 2018

The group created 500 computer models focused on specific job functions and locations. They taught each to recognize some 50,000 terms that showed up on past candidates’ resumes. The algorithms learned to assign little significance to skills that were common across IT applicants, such as the ability to write various computer codes, the people said.

Instead, the technology favored candidates who described themselves using verbs more commonly found on male engineers’ resumes, such as “executed” and “captured,” one person said.
bias  gender  hiring  algorithms  Amazon 
9 weeks ago by macloo
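The excerpt above only sketches the approach at a high level: models that learn weights for tens of thousands of resume terms from historical hiring outcomes. Below is a minimal toy illustration of how such a scorer can absorb bias from skewed training labels, using scikit-learn and invented data; the resumes, labels, and model choice are assumptions for demonstration, not Amazon's actual system.

```python
# Illustrative only: a toy resume scorer trained on historically skewed labels.
# The data, labels, and model are invented; the real system (reportedly ~500
# models over some 50,000 terms) is not public.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny synthetic "history": past hires (label 1) skew toward one writing style,
# mirroring a decade of male-dominated hiring.
resumes = [
    "executed migration of services, captured key metrics, java python",
    "executed redesign of pipeline, captured requirements, java sql",
    "led migration of services, java python, women's chess club captain",
    "led redesign of pipeline, java sql, women's coding society",
]
hired = [1, 1, 0, 0]  # historical outcomes, not ground truth about ability

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weights mirror whatever correlates with the skewed labels:
# "executed"/"captured" come out positive, "women" comes out negative.
for term, weight in sorted(zip(vectorizer.get_feature_names_out(),
                               model.coef_[0]), key=lambda t: t[1]):
    print(f"{term:>12s}  {weight:+.3f}")
```

Nothing in this sketch is "anti-women" by design; the negative weight on "women" falls straight out of the skewed labels, which is the point the article's sources make.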
this is fun - Amazon's own hiring tool pointed out their own hiring biases. thanks machines!
AI  from twitter_favs
9 weeks ago by danmactough
Crap in, crap out: Amazon scraps secret AI recruiting tool that showed bias against women | Reuters
from twitter_favs
9 weeks ago by Arnte
RT : Amazon scraps secret AI recruiting tool that showed bias against women
from twitter_favs
9 weeks ago by coty
Jeffrey Dastin:
The team had been building computer programs since 2014 to review job applicants’ resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters.

Automation has been key to Amazon’s e-commerce dominance, be it inside warehouses or driving pricing decisions. The company’s experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars - much like shoppers rate products on Amazon, some of the people said.

“Everyone wanted this holy grail,” one of the people said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.

That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.


So it's more accurate to say that the AI tool *revealed* bias against women. But then it kept on doing the same: it penalised CVs which included "women's". Eventually they realised they couldn't get it right.
amazon  ai  bias  gender 
9 weeks ago by charlesarthur
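Continuing the toy example above in the same hedged spirit (all data and terms invented): this sketch shows why editing a model to be "neutral" to explicit terms such as "women's" is no guarantee of fairness, since correlated proxy terms still carry the same signal - roughly why, per the reporting, the team concluded they couldn't get it right.

```python
# Illustrative only: zeroing the weight on an explicit term does not remove
# bias that correlated "proxy" terms still carry. All data here is invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "executed launch, java, rugby team",
    "executed rollout, python, rugby team",
    "led launch, java, netball team, women in tech society",
    "led rollout, python, netball team, women in tech society",
]
hired = [1, 1, 0, 0]  # historically skewed labels

vec = CountVectorizer()
X = vec.fit_transform(resumes)
clf = LogisticRegression().fit(X, hired)

candidate = vec.transform(
    ["led rollout, java, netball team, women in tech society"])
print("score before edit:", round(clf.predict_proba(candidate)[0, 1], 3))

# "Make the model neutral" to the explicit term by zeroing its weight...
women_idx = list(vec.get_feature_names_out()).index("women")
clf.coef_[0, women_idx] = 0.0

# ...but correlated proxies ("netball", "led") still drag the score down.
print("score after edit: ", round(clf.predict_proba(candidate)[0, 1], 3))
```

In a real pipeline the proxies are subtler - colleges, clubs, verb choice - which is what makes the problem hard to patch term by term.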
SAN FRANCISCO (Reuters) - Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women. via Pocket
IFTTT  pocket  ai  algorithm 
9 weeks ago by mgacy
Lots of great analysis coming out about the benefits and drawbacks of AI in recruiting.
from twitter_favs
9 weeks ago by Vince
RT : Amazon scraps secret recruiting tool that showed bias against women …
from twitter
9 weeks ago by techreckoning
It now uses a “much-watered down version” of the recruiting engine to help with some rudimentary chores, including culling duplicate candidate profiles from databases, one of the people familiar with the project said // after the allegation, the robot was called into HR and transferred to filing
ai  machinelearning  racistcomputer 
9 weeks ago by yorksranter
That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.
twig  688 
9 weeks ago by leolaporte
SAN FRANCISCO (Reuters) - Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.
gender  technology  hiring  via:ramitsethi 
9 weeks ago by eaconley
SAN FRANCISCO (Reuters) - Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.
gender  technology  hiring 
9 weeks ago by ramitsethi
Amazon trained an AI model to classify resumes based on its previous hiring. Later they discovered that the model had negative weights on terms like "women" (e.g., "women's chess club"). They finally scrapped the whole model.
ai  bias  amazon  hiring  womenintech  via:HackerNews 
9 weeks ago by mcherm
Favorite tweet: broderick

Amazon built an AI to rate job applications. It analyzed 10 years of (male dominated) hires. Then it started penalizing resumes that included the word “women’s,” downgrading graduates from all women's colleges, and highly rating aggressive language. https://t.co/j0xCBu3riC

— Ryan Broderick (@broderick) October 10, 2018

http://twitter.com/broderick/status/1050019537637318656
IFTTT  twitter  favorite 
9 weeks ago by tswaterman
Bar charts showing gender diversity among Amazon, Google, Facebook and Microsoft's global and tech workforce.
bias  amazon  hiring  algorithmic-bias 
9 weeks ago by hay
.@Amazon scraps secret #AI recruiting tool that showed bias against women: https://t.co/T3ik7G90PG via @Reuters (tip: @Techmeme )

— Jeffrey Dastin (@JLDastin)…
from instapaper
9 weeks ago by mathewi
That feeling when you retweet an interesting-looking article before realizing you're quoted a bit further down...
from twitter_favs
9 weeks ago by fedira
This story is crazy: "In effect, Amazon’s system taught itself that male candidates were preferable."
from twitter_favs
9 weeks ago by nowthis
Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry. […] Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter.


nice demo of algorithmic bias right there. Worrying that there are plenty of other places carrying on with the concept though....
algorithmic-bias  amazon  hiring  resumes  bias  feminism  machine-learning  ml 
9 weeks ago by jm
Amazon scraps secret AI recruiting tool that showed bias against women by $AMZN
from twitter_favs
9 weeks ago by brchastain