
Amazon’s AI-powered recruiting tool was biased against women — Quartz

The team decided to train the system on the previous 10 years of resumes sent to Amazon, most of which came from men.
When the algorithm reached its conclusions about what made a good or bad applicant, it mirrored the bias toward hiring men that Amazon had shown in the past.


Honestly, I don't think this algorithm is bad.
It successfully reflected the organization's own bias.
That makes it useless as a predictive tool, but profoundly important as an auditing tool.
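The mechanism is simple enough to sketch: a model fit to biased historical decisions will reproduce the bias, and measuring its per-group outputs surfaces that bias directly. The following is a minimal hypothetical illustration (the biased labeling rule and the skill-score data are invented for the sketch, not taken from the article):

```python
import random

random.seed(0)

# Hypothetical historical hiring rule: the same skill scale, but resumes
# from group "F" face a higher bar than resumes from group "M".
def historical_label(group, skill):
    bar = 0.5 if group == "M" else 0.8
    return skill > bar

# Synthetic "10 years of resumes": (group, skill score in [0, 1)).
resumes = [(random.choice("MF"), random.random()) for _ in range(10_000)]
data = [(g, s, historical_label(g, s)) for g, s in resumes]

# "Train" the simplest possible model: per-group acceptance frequency.
def acceptance_rate(group):
    labels = [hired for g, _, hired in data if g == group]
    return sum(labels) / len(labels)

# The learned rates mirror the bias baked into the training labels --
# worthless for deciding who to hire, but a direct audit of past decisions.
print(f"M: {acceptance_rate('M'):.2f}")
print(f"F: {acceptance_rate('F'):.2f}")
```

Running this prints an acceptance rate for "M" well above the rate for "F", even though skill is drawn from the same distribution for both groups: the gap in the model's outputs is exactly the gap in the historical decisions.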
machine.learning  fail  amazon  labor  management  diversity  failure  training.data  data  big.data  gigo  recruiting  resumes  no.fucking.shit  confirmation.bias  biases 
5 weeks ago by po

