For Amazon, it was the “holy grail” of recruiting: a computer program to which you submit a hundred resumes and which returns the ones it deems most promising, leaving humans only the final choice. The online retail giant developed it in 2014, then tested it, before realizing a year later that the system had a major flaw: it rejected women’s applications.

Resumes containing the word “women’s” were in fact penalized by the program, according to Reuters’ sources. A candidate listing “women’s chess club captain” among her activities, for example, could be downgraded, while words more often used by men were valued. Candidates who had graduated from two all-women’s colleges also saw the score the system assigned them lowered for no reason. Amazon detected the problem in 2015 and corrected it, but abandoned the project in early 2017, doubting that the system could be kept from discriminating on other criteria. The reason for this failure undermines the case for subcontracting such tasks to artificial intelligence.

A lack of diversity to blame

Before being shown any new resumes, the program had in fact absorbed all of the applications Amazon received in the ten years before its creation, Reuters explains. A majority of them had been sent by men, which led the program to consider male candidates more suitable. In other words, the lack of diversity in the tech sector, the product of a set of socially constructed mechanisms, had been absorbed by the artificial intelligence. Amazon’s recruiters were expecting a coolly objective selection, but ended up with a result fed by an unequal culture.
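To make the mechanism concrete, here is a minimal sketch, in Python, of how a screening model trained on biased historical outcomes can learn to penalize a word like “women’s”. The resumes, labels and model choice below are invented purely for illustration; this is not Amazon’s actual system.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy "historical" applications with past hiring outcomes (1 = hired).
# Because past hires skew male, tokens associated with women co-occur
# with rejections, whatever the candidates' actual qualifications.
resumes = [
    "software engineer chess club captain",
    "developer hackathon winner",
    "software engineer women's chess club captain",
    "graduate of a women's college, developer",
    "engineer robotics team lead",
    "engineer women's coding society member",
]
hired = [1, 1, 0, 0, 1, 0]  # biased historical outcomes

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token "women" is negative: its mere
# presence lowers the score the model assigns to a resume.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(weights["women"])  # prints a negative number

A real pipeline would be far larger, but the failure mode is the same: the system faithfully reproduces whatever regularities, fair or not, exist in its training data.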

This is not the first time the tech giants’ lack of diversity has had a negative influence on their tools or products. Three years ago, an American complained on Twitter of having been labeled a “gorilla” by Google Photos. The app sorts images into folders on its own, according to what it believes it sees, Le Figaro explained at the time. Instead of seeing two people in a photo, the developer and a friend, both Black, the application thought it was recognizing two monkeys, a comparison that is racist in more than one way.

Why such an error? As in Amazon’s case, the artificial intelligence learns from what it is fed. Yet 60% of Google’s employees are white, the paper noted, putting forward the idea that the team had not fed the system a sufficiently diverse set of faces, having failed to take into account the variety of the product’s users. The Mountain View firm also defended itself by citing the “strong contrast” of the photo, Le Figaro wrote.

Serious consequences

Do companies learn from their mistakes? Despite the failure of its first attempt, Amazon has persevered with automated recruiting, this time giving “more importance to diversity,” a source told Reuters. Last July, the specialist site The Verge questioned several companies, including Google, about their efforts to combat bias in facial recognition. “We regularly test our models to make sure they do not reproduce bias and to make them fairer,” the company replied, without saying more.

For the companies developing these programs, the stakes are nevertheless high as such tools become commonplace in recruiting, healthcare and even criminal justice. Last month, the site Quartz cited the case of a Toronto start-up behind a program that detects Alzheimer’s disease through speech, which only worked reliably for native Canadian English speakers. In 2016, the site ProPublica revealed that a recidivism-prediction program attributed that risk more often to Black offenders. Proof that, for an artificial intelligence to work, it must first be purged of the prejudices of the society in which it was created.
