Why racism also plays a role in technology, and how racial literacy can be part of the solution
The International Day against Racism
21 March was the International Day against Racism. The day draws attention to the fact that racism is not a problem of the past but happens every day, noticed or unnoticed. As a tech company, we asked ourselves two questions:
- What role do technology and programming actually play in the context of racism?
- What solutions are currently being discussed?
In the process, we came across the concept of "racial literacy", which we would like to present to you as part of a solution.
What algorithms and racism have to do with each other
Laura Schelenz and Prof. Regina Ammicht Quinn from the University of Tübingen explain it this way:
"Algorithmic systems are computer models that are trained to solve a certain task using data sets. Often these datasets themselves have biases, for example by having a high proportion of data on white people but only a low proportion of information on people with dark skin." (1)
But where can this be found concretely in everyday life? Here are three examples.
Example 1: Facial recognition works. If you are a white male.
In the New York Times, Steve Lohr (2) showed that recognition technologies reliably recognise faces when those faces are white. Why is that? The systems were mainly trained on data from white (80 per cent of participants) and male (75 per cent of participants) test subjects. As a result, the algorithm recognised white men with 99 per cent accuracy, but black women with only 65 per cent.
According to Lohr, in practice this leads to racial profiling, i.e. classifying a person on the basis of their skin colour or other distinctive physical features. Brenda Medina and Thomas Frank described something similar elsewhere (3): hairstyles that are predominantly popular among black people are flagged as a risk by airport body scanners, which in turn leads to more physical checks. The same applies to wigs and turbans.
Assistant Professor of Sociology Peter Hanink examined data and statistics from the New York Police Department (4). He found that skin colour and poverty, along with neighbourhood crime rates, had a significant influence on who was stopped and investigated, and when.
Example 2: A Harvard professor as a criminal?
Professor of Government and Technology Practice Latanya Sweeney was alerted by a colleague that when her name was googled, an advertisement titled "Latanya Sweeney. Arrested?" appeared. Yet the researcher had never been arrested. She then examined the ad-delivery algorithm and found that names typically associated with black communities were very likely (in 81 to 86 per cent of cases on Reuters and 92 to 95 per cent on Google) to be shown alongside an ad suggesting a criminal record. The same did not happen for names associated with white people (5).
Example 3: How the internet turned a chatbot into a racist in under 24 hours
Senior reporter James Vincent reported on the artificial-intelligence chatbot that Microsoft unveiled in 2016 (6). The idea was that the bot would get smarter the more people talked to it. Over more than 96,000 tweets, the bot interacted with other people on the internet, absorbed the data and information it was fed, and reproduced it based on other users' behaviour. Within 15 hours, Vincent reports, the bot began tweeting contradictory messages; within 24 hours it was producing racist and misogynistic remarks and hate speech.
Racial Literacy as a way out of the crisis
Racial literacy means becoming aware of old patterns and prejudices and breaking them down. In 2019, three researchers in Data & Society's Fellowship Programme presented the findings of their work "Advancing Racial Literacy in Tech" (7). Racial literacy, they argue, offers an innovative way to reduce the harm caused by racial bias.
According to the authors, three things are needed for this change:
- An awareness and intellectual understanding of how structural racism operates in algorithms,
- the emotional intelligence to resolve racist situations within organisations, and
- the conviction to want to take actions that reduce the harm to black communities.
What does this mean concretely for (tech) companies?
Co-author Mutale Nkonde gave an interview to Der Spiegel in 2019 on the topic of "Artificial Intelligence - How much racism is in algorithms?" (8). When asked whether companies also want to change from within, for example by promoting black young talent, she said:
"In terms of recruiting, I'm very disappointed. I went through Google's machine learning team and found only one black man and one black woman - out of 893 employees. There are hardly any black people involved in artificial intelligence development and research, the quotas of black employees at tech companies are low."
Yet there are approaches that work. In a 2018 study (9), researchers Paul Gompers and Silpa Kovvali demonstrated a so-called "diversity dividend": they showed that teams benefit from diversity, measured in hard financial indicators such as turnover and profitability. Rocio Lorenzo and Martin Reeves likewise showed in "How and Where Diversity Drives Financial Performance" (10) that more diverse work environments deliver better financial results. Even so, many technology companies lag behind in putting such standards into practice.
We at wunschlösung can step it up a notch too. So if you're a Java or Angular developer, or know someone with skills you think we could use, get in touch! What you look like, where you come from or what your orientation is doesn't matter at wunschlösung. You just need to have a passion for what you do. You can find more information about vacancies here.
Conclusion
Racism is (still) a problem in technology as well. To change this, people of all ethnicities and skin colours must be more actively included in production and administrative processes. Awareness must be raised about systematic inequality and how to deal with it. Active steps must be taken to develop and apply less biased procedures in the future.
If you want to delve deeper:
1) Laura Schelenz and Prof. Regina Ammicht Quinn, “Black Lives, Trans Rights und Algorithmen”
2) Steve Lohr, “Facial Recognition Is Accurate, If You’re a White Guy,” The New York Times, June 8, 2018, sec. Technology
3) Brenda Medina and Thomas Frank, “TSA Agents Say They’re Not Discriminating Against Black Women, But Their Body Scanners Might Be,” ProPublica, April 17, 2019
4) Peter Hanink, “Don’t Trust the Police: Stop Question Frisk, Compstat, and the High Cost of Statistical Over-Reliance in the NYPD,” Journal of the Institute of Justice and International Studies, 13 (2013): 99
5) Latanya Sweeney, “Discrimination in Online Ad Delivery,” SSRN Scholarly Paper (Rochester, NY: Social Science Research Network, January 28, 2013)
6) James Vincent, “Twitter Taught Microsoft’s Friendly AI Chatbot to Be a Racist Asshole in Less than a Day,” The Verge, March 24, 2016
7) Jessie Daniels, Mutale Nkonde, and Darakhshan Mir, “Advancing Racial Literacy in Tech” (Data & Society, 2019)
8) Mutale Nkonde (in an interview with Der Spiegel), „Künstliche Intelligenz – Wie viel Rassismus steckt in Algorithmen?“
9) Paul Gompers and Silpa Kovvali, “The Other Diversity Dividend,” Harvard Business Review, July 1, 2018
10) Rocio Lorenzo and Martin Reeves, “How and Where Diversity Drives Financial Performance,” Harvard Business Review, January 30, 2018