Gender and Smart Learning Technologies
By Rohit Talwar and Helena Calle
How can we tackle gender imbalance in the personalities of AI learning tools?
The Gendering of AI
The expected growth in the use of artificial intelligence (AI) in learning applications is raising concerns about both the potential gendering of these tools and the risk that they will reflect the inherent biases of their developers. Why the concern? To make it easier for us to integrate AI tools and chatbots into our lives, designers often give them human attributes; applications and robots are frequently assigned a personality and a gender. Unfortunately, in many cases, gender stereotypes are being perpetuated. The types of roles robots are designed to perform usually reflect gendered overgeneralizations of feminine or masculine attributes.
Feminine personalities in AI tools such as chatbots and consumer devices like Amazon’s Alexa are often designed with sympathetic features and given tasks related to caregiving, assistance, or service. Many of these applications have been created to work as personal assistants, in customer service, or in teaching. Examples include Emma, the floor-cleaning robot, and Apple’s Siri, the personal iPhone assistant. Conversely, masculine robots are usually designed to appear strong, intelligent, and able to perform “dirty jobs”. They typically work in analytical roles, logistics, and security. Examples include Ross the legal researcher, Stan the robotic parking valet, and Leo the airport luggage porter.
The gendering of technology is problematic because it perpetuates stereotypes and inequalities present in society today, and it helps reinforce the unequal opportunities available to different genders. These stereotypes benefit neither men nor women, as they can limit a person's possibilities and impose artificial boundaries on personality.
Response Strategies
We propose four strategies to help tackle this issue at different stages of the problem:
- Mix it up – Developers of AI learning solutions can experiment with allocating different genders and personality traits to their tools, decoupling persona from role (see the sketch after this list).
- Gender-based testing – New tools can be tested on different audiences to assess the impact of, say, a quantum mechanics teaching aid with a female voice but a distinctly masculine persona.
- Incentives for women in technology – By the time people reach the developer stage, the biases may already have set in. So, given the likely growth in demand for AI-based applications in learning and other domains, organizations and universities could sponsor women to undertake technology degrees and qualifications that emphasize a more gender-balanced approach across everything they do, from the make-up of faculty to the language used.
- Gender-neutral schooling – The challenge here is to provide gender-neutral experiences from the start, as the early experiences offered to children usually perpetuate stereotypes. How many opportunities do boys have to play with dolls at school without being bullied? Teachers’ interactions are crucial in modeling and addressing “appropriate” or “inappropriate” behavior. For example, some studies show that teachers give boys more opportunities to expand on ideas orally, and reward them more for doing so, than girls. Conversely, girls can be punished more severely for using bad language.
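As a minimal illustration of the "mix it up" strategy, the sketch below shows one way a developer might decouple a tool's persona from its role, drawing voice and personality traits at random rather than tying them to stereotyped task categories. The persona attributes, role names, and assign_persona function are hypothetical examples for illustration, not features of any particular product.

```python
import random

# Hypothetical persona attributes, deliberately decoupled from the tool's role.
VOICES = ["feminine", "masculine", "neutral"]
TRAITS = ["warm", "direct", "playful", "analytical", "calm"]

# Example roles spanning both "care" and "analytical" tasks.
ROLES = ["teaching assistant", "legal researcher", "customer support", "security monitor"]

def assign_persona(role: str, rng: random.Random) -> dict:
    """Assign a voice and personality traits independently of the role,
    so that over many assignments no role is systematically gendered."""
    return {
        "role": role,
        "voice": rng.choice(VOICES),
        "traits": rng.sample(TRAITS, k=2),
    }

if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed so the example output is reproducible
    for role in ROLES:
        print(assign_persona(role, rng))
```

The same randomized assignments could also feed the gender-based testing strategy, by generating voice and persona combinations whose impact on learners can then be compared.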
A version of this article originally appeared in Training Journal.
Image: https://pixabay.com/images/id-3950719/ by john hain