Robots also discriminate by gender and race


Researchers have found that many computer programs and robots exhibit the same gender and racial biases as humans.

Experts warn that robots with artificial intelligence are gradually replacing humans in many decisions, but they also show gender and racial discrimination, the Telegraph reported on August 24.


Robots screening candidates in place of humans can filter them by gender and race. (Photo: Occupy Corporatism).

The new study shows that programs designed to screen applicants for schools, or to assess eligibility for insurance and bank loans, can discriminate against women and people of color.

One program, developed to filter applicants to a medical school in the UK, automatically rejected candidates who were women, black, or from ethnic minorities.

Researchers at Boston University demonstrated bias in artificial-intelligence algorithms by training a text-analysis system on Google News articles. When asked to complete the analogy "man is to computer programmer as woman is to...", the system answered "homemaker".
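This kind of bias can be reproduced with off-the-shelf tools. The sketch below is illustrative rather than the researchers' actual code: it assumes the gensim library and the publicly released word2vec vectors trained on Google News (the file path is hypothetical), and it solves the analogy by simple vector arithmetic.

# Minimal sketch, not the Boston University researchers' code: query a
# word-embedding model trained on Google News for the biased analogy.
from gensim.models import KeyedVectors

# Hypothetical local path to the public Google News word2vec file.
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# "man" is to "computer_programmer" as "woman" is to ... ?
# Computed as: computer_programmer - man + woman, then nearest neighbours.
result = vectors.most_similar(
    positive=["computer_programmer", "woman"],
    negative=["man"],
    topn=3,
)
print(result)  # biased embeddings rank words such as "homemaker" near the top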

Another American program was given the task of analyzing photos on social networks. Shown a picture of a man standing in a kitchen, it still identified the person as a woman.


Professor Noel Sharkey warns about gender and racial discrimination in robots. (Photo: PA).

According to Professor Noel Sharkey, director of the Foundation for Responsible Robotics, deep-learning algorithms dominate artificial-intelligence software, and they are not impartial because most of the engineers who build them are men. Women currently account for roughly 9 percent of the engineering workforce in the UK and only 20 percent of A-Level physics students.

The problem stems mainly from the flawed data these systems are trained on, according to leading data expert Maxine Mackintosh.

"These data are the mirror of society - they reflect bias and inequality in the community," she told the BBC. "If you want to change that, you can't just use the old information."

According to an investigation by ProPublica last year, the COMPAS program used by US courts to assess the risk of reoffending viewed black prisoners unfavorably: it tended to wrongly predict that black prisoners were more likely to reoffend than white prisoners.
