
Algorithms exclude certain people, says sociologist

Credit: Adobe Stock

Algorithms increasingly support human decision-making. However, this code often contains errors and simplifications that can lead to discrimination against certain groups, according to Dr. Kuba Piwowar, a sociologist and cultural expert from SWPS University.

Internet algorithms suggest online content that we are likely to be interested in. Banks use algorithms to assess creditworthiness. And although these decisions may seem objective, this is not always the case.

During the SWPS University webinar 'Algorithms - discrimination and exclusion and their social consequences', Dr. Kuba Piwowar said that algorithms do not reflect reality 1:1. They always contain simplifications, because this is the nature of digital data. Due to these simplifications embedded in algorithms, some groups will fare better than others. He said: “It is usually worse for the people who have been marginalized so far.”

THE CONSEQUENCES OF ALGORITHMIC EXCLUSION

According to the SWPS University press release, algorithms usually work on huge data sets that are processed quickly and on a large scale. That is why their decisions affect huge groups of people.

For example, predictive policing systems used in the USA send police units to places where a crime is expected to occur before it happens, so that officers can prevent it or catch the perpetrator in the act. Such a system uses historical data on detected crimes. As a result, many places where crimes have been committed but never recorded by the police will be considered safer than they actually are.
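The mechanism can be illustrated with a minimal sketch. The district names and numbers below are invented for illustration and are not taken from any real system; the point is only that a model sees recorded crimes, not actual ones.

```python
# Invented example: two districts with the same number of crimes,
# but very different shares of those crimes ending up in police records.
districts = {
    "A": {"true_crimes": 100, "recording_rate": 0.9},   # heavily patrolled
    "B": {"true_crimes": 100, "recording_rate": 0.3},   # rarely patrolled
}

for name, d in districts.items():
    # The algorithm is trained only on what was recorded, so district B
    # appears far safer than district A despite identical crime levels.
    recorded = d["true_crimes"] * d["recording_rate"]
    print(f"District {name}: true crimes={d['true_crimes']}, "
          f"recorded in data={recorded:.0f}")
```

Any score built on the "recorded" column will then direct patrols back to the already heavily patrolled district, reinforcing the original gap in the data.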

Another example from the United States is the COMPAS system, introduced to support judges' decisions on releasing convicts from prison before the end of their sentence. One of the reasons behind this solution was that judges' decisions were not free from errors: decisions issued in the morning tended to be less severe than those issued in the afternoon. According to researchers, a possible explanation for these differences is a relationship between the severity of sentences and the hunger felt by judges.

Since people's decisions on such important matters depended on non-substantive factors, this process was considered worth supporting with algorithms.

The goal was to introduce a solution that would make such important verdicts independent of factors such as hunger or satiety. The problem, however, was that the data on which the system operated were not representative of society as a whole, because US prisons hold many more men than women, more young people than older people, and more people of colour than white people.
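A minimal sketch, using invented numbers rather than the actual COMPAS data, shows why such a skew matters: any model tuned to minimise overall error will mostly fit the over-represented group, so its "average" performance says little about the under-represented one.

```python
# Invented composition of a training set versus the general population.
training_set = {"men": 9000, "women": 1000}          # strongly skewed sample
population_share = {"men": 0.5, "women": 0.5}        # roughly balanced society

total = sum(training_set.values())
for group, count in training_set.items():
    # Overall accuracy is dominated by the 90% group, so the model can look
    # good "on average" while being systematically worse for the rest.
    print(f"{group}: {count / total:.0%} of training data vs "
          f"{population_share[group]:.0%} of the population")
```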

These models reflect, in a slightly distorted way, phenomena occurring in society. For example, if you search for 'physicist', you will see photos of men in lab coats rather than a photo of Maria Skłodowska-Curie or other accomplished women researchers. This is because the results mirror the make-up of the profession, in which men predominate.

Banks, in turn, profile us based on the various data available to them, for example our transactions. If the bank notices that you spend a given amount on your car every month, you will receive a car insurance offer. It is similar when you apply for a loan: the decision is made on the basis of your profile as a borrower. The problem, however, is that these algorithms operate in a non-transparent way, and we are unable to appeal against their decisions.
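The kind of profiling described here can be sketched as a simple rule over transaction data. The categories, amounts and threshold below are assumptions for illustration, not any real bank's logic; the point is that the customer never sees the rule and cannot contest it.

```python
# Invented one-month transaction history for a single customer.
monthly_transactions = [
    {"category": "fuel", "amount": 120.0},
    {"category": "fuel", "amount": 95.0},
    {"category": "groceries", "amount": 430.0},
]

def build_profile(transactions, car_spend_threshold=150.0):
    """Derive a crude customer profile from recurring spending categories."""
    car_spend = sum(t["amount"] for t in transactions
                    if t["category"] == "fuel")
    # Hidden rule: regular car-related spending above a threshold
    # flags the customer as a likely car owner.
    return {"likely_car_owner": car_spend >= car_spend_threshold}

profile = build_profile(monthly_transactions)
offers = ["car insurance offer"] if profile["likely_car_owner"] else []
print(profile, offers)
```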

According to Dr. Piwowar, the solution is education. We should teach data analytics in both primary and secondary schools, giving students an opportunity to become familiar with it. At the higher education level, especially in technical fields, students should take ethics classes that help them understand the dilemmas related to how algorithms work.

Another important issue raised during the lecture was the introduction of appropriate regulations. The European Union is leading in this regard, putting forward many proposals to regulate algorithmic exclusion; for example, institutions are required to report how data are collected, because, as mentioned earlier, the data an algorithm uses determine the results of its analysis.

According to the cultural expert, we should also focus on creating a sense of responsibility among people who work with data. In the US, this is treated as a profession of public trust, because the decisions made by their algorithms affect people's quality of life.

In addition, according to the conclusions from the webinar, we should pay attention to soft issues. 

Piwowar said: “We - citizens - should learn that these algorithms are around us and we should demand answers to specific questions, for example why exactly the bank asks us for certain information. 

“Perhaps we will not get an answer, but the very awareness that we can ask about it means that we are more conscious consumers, as a result of which, in the long run, products will be created in a more conscious way.”

PAP - Science in Poland

lt/ agt/ kap/

tr. RL


