AI and the Gender Equality Issue

“You’re like a robot” is often said of someone who shows very little emotion. The underlying implication is that machines such as robots are non-human and are thus capable of a level of indifference that human beings are not. What this overlooks, however, is that machines are in fact replications of humans. Any small machine, let alone a robot, is built to mimic a human function so that a person’s involvement is no longer required, that is, to mechanize human labor. Artificial Intelligence does precisely this through machine learning. It incorporates aspects of human behavior as observed in real time, replete with irrational actions and biases. Yet AI is thought to be ‘smart’, which raises the question: can AI and machine learning erase the biases coded into human behavior itself? To answer this question, let us look at the manner in which AI deals with gender bias.

Gender can be thought of as a set of characteristics that a person is either assigned at birth or takes on as an identity, giving social meaning to their body, language, voice, behavior and relationships. In its functioning, gender is thus social. Similarly, AI is social. Artificially intelligent machines live in the world only through their interactions with human beings, that is, through machine learning. In these interactions, it is often difficult to draw and maintain a rigid line between the human and the machine, and through them AI takes on very social, very human characteristics of its own. If we examine closely the manner in which gender plays out in the realm of AI, a great deal of insight can be gathered into the gender biases that exist in society itself. Moreover, such an examination reveals that AI is not set up to eliminate this bias, but rather to reproduce it.

Hannah Rozenberg, a graduate of the Royal College of Art, in her 2019 thesis “Building without Bias”, discusses the ways in which gender is coded into architecture, focusing on the example of the ‘hyper-masculine’ St. James locality in London. In her experiment, she adds elements to the space to ‘neutralize’ it, applying a GU (gender units) scale. This scale learns to make gender associations by gathering and interpreting data from the internet. Rozenberg notes that “it learns that what concrete, steel, and wood are to men, lace, glass, and bedroom are to women”. Since it is this unit that ultimately gauges the gender neutrality of the space, she argues that although gender is a human element, subject to differing human perceptions, the non-human unit had the final say in characterizing the locality.
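Rozenberg’s GU scale is not publicly documented, but the mechanism she describes, learning gendered associations from internet text, is easy to illustrate. The sketch below is a toy in Python, with a hand-made four-sentence ‘corpus’ standing in for scraped web data and purely illustrative words; it is an assumption-laden illustration of the principle, not her actual tool. It shows how a system that merely counts co-occurrences can end up tagging steel and concrete as ‘male’ and lace and bedroom as ‘female’.

```python
# Toy sketch, not Rozenberg's actual GU system: how gendered associations
# can be "learned" purely from co-occurrence statistics in text.

# A hand-made corpus standing in for scraped web text (real systems would
# ingest millions of pages; the principle is the same).
corpus = [
    "he poured the concrete and welded the steel frame",
    "he sanded the wood beams in his workshop",
    "she chose lace curtains and glass ornaments for her bedroom",
    "she arranged the glass lamps and lace cushions in the bedroom",
]

MALE_TERMS = {"he", "him", "his"}
FEMALE_TERMS = {"she", "her", "hers"}

def gender_association(word):
    """Count how often `word` shares a sentence with male vs. female terms."""
    male = female = 0
    for sentence in corpus:
        tokens = set(sentence.split())
        if word in tokens:
            male += len(tokens & MALE_TERMS)
            female += len(tokens & FEMALE_TERMS)
    return male, female

for word in ["concrete", "steel", "wood", "lace", "glass", "bedroom"]:
    m, f = gender_association(word)
    leaning = "male" if m > f else "female" if f > m else "neutral"
    print(f"{word:10s} male={m} female={f} -> learned as {leaning}")
```

The counting method here is crude by design; the point is the pipeline. Whatever associations the data carries, the machine repeats them with an air of objectivity.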

To understand the source of the biases the GU scale takes on, and to analyze them through a human lens, she conducts a search engine experiment. She types ‘CEO’ into a search engine; the resulting images are overwhelmingly of men. By contrast, when she keys in ‘Assistant’, the results are overwhelmingly pictures of women. She then types the phrase ‘She is a leader’ into an online translation tool, setting the target to Turkish and other languages in which a single pronoun covers both he and she. Translated back into English, the phrase becomes ‘He is a leader’. When she types in ‘He is compassionate’, the tool translates it back as ‘She is compassionate’. Such results imply that existing gender stereotypes are recorded and reproduced by AI.
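The round-trip result is easy to reason about: when a gender-neutral pronoun (such as the Turkish ‘o’) must be rendered into English, a system trained on biased text tends to pick whichever English pronoun it has seen most often alongside the word in question. The sketch below is a deliberately simplified, purely statistical caricature of that choice; the co-occurrence counts are invented for illustration, and real translation systems are neural, but the direction of the bias comes from the training data in much the same way.

```python
# Caricature of pronoun selection when back-translating from a language
# with a gender-neutral pronoun. Counts are invented, purely illustrative.
cooccurrence = {
    ("leader", "he"): 930, ("leader", "she"): 70,
    ("compassionate", "he"): 180, ("compassionate", "she"): 820,
}

def pick_pronoun(keyword):
    """Choose the English pronoun seen most often with `keyword` in training text."""
    he = cooccurrence.get((keyword, "he"), 0)
    she = cooccurrence.get((keyword, "she"), 0)
    return "He" if he >= she else "She"

print(pick_pronoun("leader") + " is a leader")              # -> He is a leader
print(pick_pronoun("compassionate") + " is compassionate")  # -> She is compassionate
```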

Gender is socially normative: there are socially coded roles and responsibilities associated with each gender. Beyond the stereotypes, this gender normativity has also been extended to machines. This can be seen in the selection of female voices for prominent AI assistants. Engineers at one prominent company that develops AI assistants, for instance, have been quoted as saying that they picked a female voice because they believed it “best embodies the qualities expected of the digital assistant—helpful, supportive, and trustworthy.” This normativity is practiced by other such companies as well. Some even pick names that carry the same implication: one company named its AI assistant after a Nordic word meaning “a beautiful woman who helps you to victory”.

There are, however, a few AI assistants that are programmed with a male voice as well. One such company in South Africa, for instance, offers a choice between three female voices and one male voice. However, the female assistants’ responses are scripted differently from the male one’s. For example, when prompted with the phrase “Let’s talk dirty”, the female voices respond with “I don’t want to end up on Santa’s naughty list”, while the male voice responds with “I’ve read that soil erosion is a real dirt problem”. The female voices are made to sound eager and pleasing, while the male voice is simply matter-of-fact.

One major instance of such problematic gender normativity coded into AI assistants was highlighted in 2019. When prompted with “You’re a bitch” and other similarly abusive statements, one AI assistant’s (with a female voice, no doubt) recorded response was “I’d blush if I could”. The exchange received a great deal of global attention, both as entertainment and as a target of criticism. It even lent its title to a UNESCO report on the gender gap in ICT and STEM fields across different countries. That report, and the instance highlighted in its title, sparked further conversation about the reproduction of sexism within the realm of AI.

Consequently, various instances of sexism and stereotyping practiced by algorithms began to be highlighted. In 2019, for instance, a prominent credit card company was called out for sexism in its credit limit allocations: its algorithm allotted a man a credit limit twenty times higher than his wife’s, despite the fact that she had the better credit score. A second couple reported the same kind of discrimination by the same company. The creators of the algorithm could not explain the allocation without owning up to a gender bias in the credit decisions made by employees themselves, and a prolonged tussle followed between the company and the creators of the algorithm over who was to assume accountability for the practice. These instances are productive in understanding that any bias, discrimination or normativity present in AI technology and algorithms is a reflection of the discrimination present in the information fed into them.

The good news, however, is that because Artificial Intelligence is a simulation, it can be controlled and modified, even to be better than human beings. The computer scientist Astro Teller once said: “Building intelligent machines can teach us about our minds – about who we are – and those lessons will make our world a better place. To win that knowledge, though, our species will have to trade in another piece of its vanity.” Take, for example, the case of sexbots. They are often made to look and behave better than the average man or woman by amplifying certain traits; the AI-based bot Harmony, for instance, can be configured with 18 different personality traits. It is thus possible to alter and control the characteristics of bots. Instead of keeping these enhancements in line with the conventional and often problematic characteristics associated with a particular gender, socially normative gender roles can be actively ignored. We can trade in the vanity, and the world can indeed be made a better place: a place where women are respected, men are emotionally secure, and queer and trans people are not discriminated against; a less violent, more tolerant world.

Since AI not only reproduces but also reaffirms existing gender roles, the power it holds is immense. That power can be used progressively, to help us transgress the rigid, and often violent, gender binary that the world operates by today.