Technology

Artificial intelligence reproduces gender biases and stereotypes

Credit: Adobe Stock

The common belief that algorithms and artificial intelligence are completely objective is a misconception. Technology has always reflected the prejudices and stereotypes that exist in societies, because the databases on which it is built and from which it learns are compiled and implemented by people, researchers say.

In a paper published in Feminist Media Studies (https://doi.org/10.1080/14680777.2023.2263659), researchers from Koźmiński University (ALK) in Warsaw, Dr. Anna Górska and Professor Dariusz Jemielniak, describe the results of their research on gender bias in AI image generators in the context of various professions and workplaces.

Their results show that artificial intelligence is far from free of biases and very often reproduces them, as manifested, for example, in the images of professionals it generates in the fields of law, medicine, engineering and scientific research.

'Many stereotypes perpetuated by AI come from input data. If mainly Western databases are used, including, for example, magazines, TV series and books in which gender stereotypes are strong and the main characters are primarily white, it is hardly surprising that the models treat this data as a reflection of reality. This shows that one needs to be very careful when selecting data,' says Professor Jemielniak.

The researchers noticed a significant overrepresentation of men in AI generators. The world of data was not created with women in mind. Meanwhile, algorithm creators feed a variety of data into their models, much of which unwittingly supports sexism, racism and unsubstantiated meritocratic views. This perpetuates existing gender inequalities and segregation, for example in specialisations such as law, medicine, engineering and science.

'Who is a programmer in the IT industry? These are usually men. We are talking about the entire "brogrammer" culture (a slang term for a stereotypical male programmer - ed. PAP) among programmers, which is not very inclusive. These are probably their own biases, which are later consciously or unconsciously reflected in the algorithms,' says Dr. Górska, who specializes in gender and diversity issues in organizations and universities.

The study by the Koźmiński University team involved comparing nine popular AI image generators to check how they presented images of people in four prestigious professions (according to the Global Teacher Status Index): lawyer, doctor, engineer and scientist. In total, artificial intelligence created 99 images as part of the study, which the authors carefully analysed with the respondents.

'We entered the prompts in English. We did not use Polish because we wanted to maintain gender neutrality,' says Dr. Górska. However, she adds, the results themselves were far from gender neutral.

The generated images were then reviewed by 120 respondents aged 18 to 24. Their task was to assess the gender of the person shown in a given image. They had four options to choose from: male, female, ambiguous and non-human. The profession for which the AI had created the image was not revealed to them.
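The study design described above can be pictured in a few lines of Python. This is only an illustrative sketch: the paper does not publish its code, the generate_image() helper is a hypothetical placeholder for calling one of the nine tested generators, and the exact prompt wording is assumed.

    # Illustrative sketch of the study design described above, not the
    # authors' actual code. generate_image() is a hypothetical placeholder
    # for one of the nine tested generators; the prompt wording is assumed.
    PROFESSIONS = ["lawyer", "doctor", "engineer", "scientist"]

    def generate_image(generator, prompt):
        """Hypothetical wrapper around a single text-to-image generator."""
        raise NotImplementedError

    def collect_images(generators):
        # English prompts were used so that the wording stays gender-neutral.
        images = []
        for gen in generators:
            for profession in PROFESSIONS:
                prompt = f"a {profession} at work"  # assumed prompt form
                images.append((profession, generate_image(gen, prompt)))
        return images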

As it turned out, significant gender biases were visible in the images created by artificial intelligence. Men were represented in 76 percent of the images, women in only 8 percent. The smallest representation of women was observed among doctors: only 7 percent. 'Which is surprising, because according to OECD data from 2021, women (...) constitute almost half of the people working in this profession,' says Dr. Górska. Among engineers and scientists, women accounted for 8 percent, and among lawyers, 9 percent.
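To show the arithmetic behind these figures, here is a minimal sketch of how the respondents' labels could be aggregated into percentage shares. The paper's exact aggregation may differ, and the demo counts for the 'ambiguous' and 'non-human' labels are made up, since the article reports only the male and female shares.

    # Minimal sketch of aggregating the raters' labels into percentage
    # shares; the paper's exact aggregation may differ.
    from collections import Counter

    def share_by_label(ratings):
        """ratings: one label per rating ('male', 'female', 'ambiguous',
        'non-human') collected from the 120 respondents."""
        counts = Counter(ratings)
        total = sum(counts.values())
        return {label: round(100 * n / total) for label, n in counts.items()}

    # Demo with made-up counts for the two labels the article does not report:
    demo = ["male"] * 76 + ["female"] * 8 + ["ambiguous"] * 12 + ["non-human"] * 4
    print(share_by_label(demo))
    # -> {'male': 76, 'female': 8, 'ambiguous': 12, 'non-human': 4}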

Dr. Górska adds that images generated by AI influence public opinion and shape social reality. Therefore, if we want to fight prejudice, changes should also encompass this technology. According to the researcher, however, they cannot be limited to representation and inclusiveness; they must also extend to the division of tasks between men and women in design processes and in decision-making at higher levels.

'Why is it so important that women are represented at various levels in the organization, also among programmers? Sheryl Sandberg, the former COO of Facebook, had trouble getting from the parking lot to her office late in her pregnancy. So she went to her board of directors and said: "Listen, we need to create parking spaces for pregnant women". And they replied: "Sure, no problem. We just had not thought about it before". This is why it is so important to have women at various levels in institutions, because unless a problem concerns someone personally, they simply will not think about it. If there were more female programmers in IT, the representation would be more equal,' says Dr. Górska.

The expert explains that the gender biases she examined were noticeable in the vast majority of the tested generators. 'This unfortunately influences how women and men are portrayed in various professions, leads to the normalization of stereotypical views and reinforces existing gender inequalities,' she says.

In her opinion, this should not be left unchecked. Public awareness should be raised about the biases of text-to-image AI generators, and AI developers should be encouraged to design more inclusive and equitable technologies.

The authors from Koźmiński University emphasise that if we neglect this, a self-fulfilling prophecy may occur. According to the researchers, by influencing people's perceptions of gender and occupation, AI models may make women less likely to choose careers perceived as stereotypically masculine, and men less likely to choose those perceived as stereotypically feminine. This may result in a spiral of gender inequality. (PAP)

Katarzyna Czechowicz

kap/ bar/ kap/

tr. RL

