How to end gender bias in internet algorithms


Scopus-indexed articles for various gender-related terms. Credit: Algorithms (2022). DOI: 10.3390/a15090303

Countless studies have asked whether the internet algorithms we constantly interact with suffer from gender bias, and a simple search is all it takes to see for yourself.

However, according to the researchers behind a new study that set out to reach a conclusion on the matter, "so far the debate has not been accompanied by any scientific analysis." The new paper, written by an interdisciplinary team, takes a fresh approach to the question and suggests some solutions for avoiding these biases in data and the discrimination they bring.

Algorithms are increasingly being used to decide whether to grant loans or accept applications. As the scope, capabilities and importance of artificial intelligence (AI) grow, it is increasingly vital to assess any possible biases associated with these operations.

"Although this is not a new concept, there are many cases where this problem has not been investigated, thereby ignoring the potential consequences," the researchers said. Their study, published in the journal Algorithms, focuses mainly on gender bias in various fields of artificial intelligence.

Such biases can have a profound effect on society: "Biases affect anyone who is discriminated against, excluded, or associated with a stereotype. For example, a gender or race may be excluded from a decision-making process, or certain behaviors may simply be assumed based on a person's gender or skin color," explained Juliana Castañeda Jiménez, an industrial doctorate student at the Universitat Oberta de Catalunya (UOC) under the supervision of Ángel A. Juan of the Universitat Politècnica de València, the study's principal investigator, and Javier Panadero of the Universitat Politècnica de Catalunya.

According to Castañeda, “it is possible for algorithmic processes to discriminate by gender, even if they are programmed to be ‘blind’ to this variable.”
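This proxy effect can be seen in a minimal, purely illustrative Python sketch. All names and numbers below are hypothetical and not taken from the study: gender is never given to the decision rule, yet a feature that is historically correlated with gender (here, a career-gap variable) carries the bias through.

```python
import random

random.seed(0)

# Synthetic historical loan data. Gender is NOT given to the model, but
# "career gap years" acts as a proxy: in this invented history, women took
# longer career gaps and were denied more often as a result.
def make_applicant():
    gender = random.choice(["F", "M"])
    gap = max(0.0, random.gauss(3.0 if gender == "F" else 0.5, 1.0))
    approved = gap <= 2.0  # the historical (biased) decision rule
    return gender, gap, approved

data = [make_applicant() for _ in range(10_000)]

# "Gender-blind" model: learn a gap threshold from historical labels only.
threshold = max(gap for _, gap, ok in data if ok)

def model(gap):
    return gap <= threshold  # never sees gender

# Approval rates by gender under the gender-blind model.
rates = {}
for sex in ("F", "M"):
    group = [gap for s, gap, _ in data if s == sex]
    rates[sex] = sum(model(gap) for gap in group) / len(group)

print(rates)  # men are approved far more often despite a "blind" model
```

Even though the model's only input is the career gap, the approval rates for the two groups diverge sharply, because the proxy variable encodes the historical disparity.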

The research team, which also includes researchers Milagros Sáinz and Sergi Yanes, both from the Internet Interdisciplinary Institute's (IN3) Gender and ICT (GenTIC) research group, Laura Calvet from the Salesian University of Sarrià, Assumpta Jover from the Universitat de València, and Ángel A. Juan, illustrates this with a number of examples: a well-known recruitment tool that favored male over female applicants, or certain loan services that offered women less favorable terms than men.

“If old, unbalanced data is used, you’ll see negative conditioning of black, gay, and even female demographics, depending on when and where the data comes from,” Castañeda said.

Science is for boys and arts is for girls

To understand how these patterns affect the different algorithms we interact with, the researchers analyzed previous work that identified gender biases in data processing across four types of artificial intelligence: natural language processing and generation, decision management, speech recognition, and facial recognition.

Overall, they found that all the algorithms identified and classified white men better. They also found that the algorithms reproduced false beliefs about the physical attributes that should define a person based on their biological sex, ethnic or cultural background, or sexual orientation, and that they created stereotypical associations linking men to the sciences and women to the arts.

Many procedures used in image and voice recognition are also based on these stereotypes: cameras recognize white faces more easily, and audio analysis has problems with high-pitched voices, which mainly affect women.

The algorithms that suffer most from these problems are those built on the analysis of real-life data tied to a specific social context. "Some of the main reasons are the underrepresentation of women in the design and development of AI products and services and the use of gender-biased data sets," said the researcher, who argued that the problem stems from the cultural environment in which the algorithms are developed.

"An algorithm trained with biased data can detect hidden patterns in society and replicate them at work. Thus, if men and women are unequally represented in society, the design and development of AI products and services will show gender biases."

How can we stop this?

The multiple sources of gender bias, as well as the characteristics of each given algorithm and dataset, mean that eliminating this bias is difficult, but not impossible.

"Designers and everyone else involved in their design should be made aware of the possibility that biases related to an algorithm's logic may exist. What's more, they need to understand and implement the measures available for minimizing potential biases, because algorithms do not operate separately from what happens in society: if designers are aware of the types of bias, they will be able to identify when the solutions they develop reproduce them," said Castañeda.
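One concrete measure of the kind the researchers describe is simply to audit a system's decisions per group before deployment. The sketch below, with hypothetical data and a hypothetical helper name, computes a demographic-parity gap, the difference in positive-decision rates between groups; it is one possible check, not the method from the paper.

```python
def demographic_parity_gap(decisions, groups):
    """Gap in positive-decision rate between the best- and worst-treated group."""
    rates = {}
    for g in set(groups):
        members = [d for d, grp in zip(decisions, groups) if grp == g]
        rates[g] = sum(members) / len(members)
    vals = sorted(rates.values())
    return vals[-1] - vals[0]

# Hypothetical model decisions (1 = approve) for a small applicant pool.
decisions = [1, 1, 0, 1, 1, 0, 0, 0]
groups    = ["M", "M", "M", "M", "F", "F", "F", "F"]

gap = demographic_parity_gap(decisions, groups)
print(f"demographic parity gap: {gap:.2f}")  # 0.75 vs 0.25 -> 0.50
```

A gap near zero does not prove fairness on its own, but a large gap like this one flags exactly the kind of disparity a design team should investigate before shipping.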

This work is innovative because it was carried out by experts in various fields, including sociologists, anthropologists and experts in gender and statistics. “Team members provided a perspective that went beyond the autonomous mathematics associated with algorithms, thereby helping to view them as complex socio-technical systems,” said the study’s principal investigator.

"If you compare this work to others, I think it is one of the few that presents the issue of bias in algorithms from a neutral perspective, emphasizing both the social and the technical aspects of why an algorithm might make a biased decision," he concluded.

Juliana Castaneda et al., Dealing with Gender Bias Issues in Data-Algorithmic Processes: A Social-Statistical Perspective, Algorithms (2022). DOI: 10.3390/a15090303

Provided by Universitat Oberta de Catalunya (UOC).

Citation: How to end gender bias in internet algorithms (2022, November 23), retrieved November 24, 2022

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.
