Raygun sponsored this podcast.
Why is it that every time I’m yelling at self-checkout tills or rolling my eyes at task-based AI, it’s always by default a female voice annoying me? Is that my internal bias against fellow women? Or is it just that I’m frustrated with the beginnings of tech, and all those AI beginners happen to seem “feminine”? These are the thoughts that ran through my head as I was listening to Dr. Charlotte Webb‘s talk “Designing a Feminist Alexa: An Experiment in Feminist Conversation Design” at Skills Matter London’s Beyond Tech conference. And it’s what we explored when we spoke, following her talk, on this episode of The New Stack Makers.
Webb is a co-founder of the Feminist Internet, a non-profit organization on a mission to advance internet equalities for women and other marginalized groups through creative and critical practice. The Feminist Internet blends art and design practices, critical thinking, and creative tech development through pilot projects and, soon, AI consulting.
So what happens when you add feminist decision-making to the creation of artificial intelligence and chatbots?
While the talk plays with the title “Feminist Alexa,” it’s a proxy for the predominantly female-named and female-voiced first-generation AI personal assistants we interact, or attempt to interact, with every day — Alexa, Cortana, Siri, Viv, Mitsuku, you know, the chatbot Spice Girls. (So tell me what you want, what you really really want…) The official reasoning they are all fem-by-default is usually that people feel more comfortable talking to soothing female voices. The deep-rooted sexist reason is that our current use of AI is very administrative and homey — setting timers and alarms, making lists, searching, and shopping — work that has historically been relegated to the female of the species.
Plus, it could be the result of who is making AI tech. According to the AI Now Institute’s 2019 study on gender, race and power in AI, only 18% of authors at leading AI conferences are women and more than 80% of AI professors are male. Women comprise only 15% of AI research staff at Facebook and 10% at Google. Only 4% of Facebook’s and Microsoft’s workforces are black, while only 2% of Google’s is.
A lack of diversity leads to bias.
chelsea peretti’s opening speech at the 10th annual tech crunchies. points were made. pic.twitter.com/LfbRHcuMkN
— ♥️ (@juristupidity) April 20, 2019
If we are going to continue to rely on and interact with chatbots and AI, and as their usage moves rapidly past gimmicks to sales reps, customer support, and technical and business processes, it’s essential that we understand any biases they have, or that we have in response to them, and work to overcome them.
In order to overcome this bias in AI design, Webb’s team and a diverse group of university engineering and design students decided to largely follow Josie Swords‘s process to design feminist chatbots, which builds on and draws from Judy Wajcman’s 2004 Technofeminist Framework. It focuses on five areas of concentration and introspection:
- What is the purpose of our chatbot? What ecosystem does our chatbot sit in? Tech never exists in isolation; what’s interacting around it?
- How have we treated and de-biased any data?
- How might our own biases as individuals and as a team be embedded in our designs?
- Instead of trying to design a “chatbot for everyone,” what marginalized user could benefit from it?
- How are we planning to depict our chatbot to users?
And that’s how, over three days, eight prototyped feminist chatbots were born. Watch the video below to get the context of the projects and listen to this episode of The New Stack Makers to understand the ethos behind feminist AI and how to apply it to your work as a developer or architect. Then, check out F’xa, a chatbot that is actually helping people learn about AI bias.
In this Edition:
1:27: What is the Feminist Internet and why is it needed right now?
2:16: What made you want to focus specifically on artificial intelligence?
9:52: How can less diverse companies ameliorate that situation?
16:57: Is all of this open source? Can people build upon these projects?
19:34: The tech we deal with every day, whether it’s Siri, Cortana, or Alexa, is already pervasive, so what can be done to make our most common uses, like setting timers, searching, or shopping, more inclusive of all different groups, not just women?
24:14: How do businesses evaluate their chatbot tech to see if it’s feminist or if it’s biased? Is there a guide somewhere? How do you test that?
Feature image by Ales Krivec from Pixabay.