Girl Power: Can We Break the Bias in AI and Beyond?

Two girls looking to the stars © iStock / EPFL 2022

We all know the story. It gets rolled out every March, when women and men around the world celebrate International Women’s Day. Human biases are well documented, from the implicit to the explicit, and they have always existed. But as algorithms and big data increasingly run our lives, these biases become embedded in everything.

“Going back to the theory of Man the Hunter, the lives of men have been taken to represent those of humans overall. When it comes to the other half of humanity, there is often nothing but silence. And these silences are everywhere. Films, news, literature, science, city planning, economics, the stories we tell ourselves about our past, present and future, are all marked – disfigured – by a female-shaped “absent presence”. This is the gender data gap. These silences, these gaps, have consequences. They impact on women’s lives, every day.”

This is an extract from Invisible Women: Exposing Data Bias in a World Designed for Men by Caroline Criado Perez, a book that Anastasia Ailamaki, a professor at EPFL’s School of Computer and Communication Sciences (IC) and head of the Data-Intensive Applications and Systems Laboratory, says was very difficult for her to read: “I wanted to break whatever furniture we had in the house. I’m a woman and I’m a database person, and it touched me at a very profound level.”

Most offices, the book outlines, are five degrees too cold for women because the 1960s formula used to determine the ideal temperature was based on the metabolic rate of a 40-year-old, 70 kg man. Cars are designed around a ‘reference man’, so women are almost 50% more likely to be seriously hurt in an accident. In 2019, a Facebook algorithm was found to be allowing advertisers to deliberately target adverts according to gender, with nursing and secretarial work suggested primarily to women. The same year, claims were made that Apple’s credit card was biased against women, offering different credit limits based on gender.

“Men are just biased against women. Women are biased against women. Everybody is biased against women. In my work, I am the only woman everywhere I go, and people say it’s because of what I do - but that’s not an input, that’s an output,” continues Ailamaki. So how can we break this bias as AI becomes ubiquitous, with the risk that gender equality is set back by decades?

There is an obvious bias in much historical data, as it reflects how society was at the time, and machine learning algorithms are trained on this data. Is it possible to tweak these algorithms to account for historical biases?

Assistant Professor Robert West is head of EPFL’s Data Science Lab (dlab), which works on issues such as fairness in machine learning. “I think we don’t know at this point, because the digital world is a complex dynamic system. It’s not like you turn a knob and then you fix the problem. It’s more like a stock market: by turning the knob you change the incentives, then everything is affected, and you don’t really know what comes out of it. I think it will come down to having experimental platforms in place where we can turn knobs and then see, for that affected group of people, how things changed.”

Despite the complexity of the systems today’s computer scientists have to navigate, work is happening on bias mitigation: optimizing algorithms to be not only accurate but also fair. But what does fair mean, and how do we measure whether something is fair?
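To make “bias mitigation” concrete, here is a minimal, hypothetical sketch of one well-known pre-processing technique, reweighing (Kamiran and Calders), which weights training samples so that group membership becomes statistically independent of the label. The groups, labels and weights below are toy values invented for illustration, not anything described in this article.

```python
from collections import Counter

def reweigh(groups, labels):
    """One weight per sample: P(group) * P(label) / P(group, label).

    Up-weights (group, label) combinations that are underrepresented in
    the historical data, so the weighted training set looks as if group
    and label were independent.
    """
    n = len(labels)
    group_counts = Counter(groups)
    label_counts = Counter(labels)
    joint_counts = Counter(zip(groups, labels))
    return [
        (group_counts[g] / n) * (label_counts[y] / n) / (joint_counts[(g, y)] / n)
        for g, y in zip(groups, labels)
    ]

# Toy historical data: group "F" rarely carries the positive label,
# so the (F, 1) sample receives the largest weight.
groups = ["F", "F", "F", "M", "M", "M", "M", "M"]
labels = [1, 0, 0, 1, 1, 1, 0, 0]
print([round(w, 2) for w in reweigh(groups, labels)])
# -> [1.5, 0.75, 0.75, 0.83, 0.83, 0.83, 1.25, 1.25]
```

Passing such weights to any standard classifier is one knob of the kind West describes: it changes the incentives of the learner without guaranteeing what comes out at the system level.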

“Fair could just be that, if you’re a female student, the algorithm has the same accuracy for your prediction as if you were a male student. Now at least people are often aware that their algorithm might be biased and maybe it needs to be checked. I feel like it’s a first step, and if I look at my research field, I would say 10 years ago no one talked about fairness; everyone just talked about accuracy. Now it’s completely changed,” said Tanja Käser, Assistant Professor and head of the Machine Learning for Education Laboratory.
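To make Käser’s example concrete, here is a minimal sketch of that check, sometimes called accuracy parity: compute the model’s accuracy separately per group and look at the gap. All values are toy data invented for illustration.

```python
def accuracy_by_group(y_true, y_pred, groups):
    """Fraction of correct predictions within each group."""
    totals = {}
    for truth, pred, g in zip(y_true, y_pred, groups):
        correct, seen = totals.get(g, (0, 0))
        totals[g] = (correct + (truth == pred), seen + 1)
    return {g: correct / seen for g, (correct, seen) in totals.items()}

# Toy predictions from a hypothetical student-outcome model.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
groups = ["F", "F", "F", "M", "M", "M", "M", "M"]

acc = accuracy_by_group(y_true, y_pred, groups)
gap = max(acc.values()) - min(acc.values())
print({g: round(a, 2) for g, a in acc.items()})  # {'F': 0.67, 'M': 0.8}
print(f"accuracy gap: {gap:.2f}")                # 0.13
```

Equal accuracy across groups is only one of several competing fairness definitions (equalized odds and demographic parity are others), which is exactly why, as West notes below, the choice cannot be left to computer scientists alone.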

West agrees that definitions of fairness will differ from person to person and that computer scientists need to actively embrace these conversations. “It’s a bit like the atomic bomb - when you have the technology, you can build power plants and you can build bombs. We have the search engine; it allows us to do fantastic things, but it also has negative side effects. I definitely think computer scientists should take ethics classes. Computer scientists shouldn’t make these decisions on their own; it’s important to be aware of the evaluation criteria that will at some point be applied to their technology.”

As this problem is a broader reflection of the biases in society, of how we are raised and how we see the dynamics of our parents’ relationships and working lives, Käser believes we need to start educating both boys and girls at a young age.

“We should start with math and STEM at elementary school level; we need to make it cooler for girls to do STEM, but this isn’t going to change fast. I like to emphasise the cool things that we can do with computer science that have an impact on society. My topic, how machine learning can optimize human learning, is interdisciplinary: I’m doing technical things, but it’s very human and I can have an important impact. I hope I can be a good role model; this is so important,” she said.

Ailamaki has a slightly different perspective. “We can’t do anything about the lady next door who thinks blue is for boys and pink is for girls, she has an opinion and we need to respect that, but we can make changes in a professional environment. We need to promote a genderless world in the workplace, one where you are judged for your ability, not whether you are a man or a woman. Then it’s a question of how far back we go. We need to stop naming kids boys and girls while in school and we need to stop teachers from telling kids ‘we’re going to give you this question because you are a boy and you this question because you are a girl’, but outside, people need to be who they are.”

West, who has three young daughters, often thinks about the world that they will grow into and his role in shaping the future. “I want them to be able to do whatever they want without social constraints. Maybe this sounds silly, but I think it’s important to be nice and welcoming to everyone, and that’s maybe even more important earlier on. I think there are a lot of girls who shy away from certain activities because they would be the only girls doing them. Thinking about what we have to change at universities is important, and yes we have to change, but I’m not sure that is the biggest bang for the buck. And we need to involve men in the conversations and solutions: we are 50% of the population and we need to be as proactive as women to create a less biased future.”


Author: Tanya Petersen

Source: Data-Intensive Applications and Systems Laboratory DIAS

This content is distributed under a Creative Commons CC BY-SA 4.0 license. You may freely reproduce the text, videos and images it contains, provided that you indicate the author’s name and place no restrictions on the subsequent use of the content. If you would like to reproduce an illustration that does not contain the CC BY-SA notice, you must obtain approval from the author.