Do Virtual Assistants Have Gender Bias?

“In the cloud, everything is beautiful,” responded the virtual assistant on my phone when I ran a little experiment before writing this article. I had told her that she was very, very pretty. Then I dared to go one step further: “How pretty you look with that on.” Leaving aside the fact that, objectively, what it was “wearing” was nothing more than my phone’s silicone case, the assistant gave me an answer that left me stunned: “Is that what you say? Talk to you later.”

Strictly speaking, the pronoun “she” does not correspond to the virtual assistant in question. If you ask it what its gender is, it will end up answering that it has none; however, its voice is clearly feminine. We could also agree that, despite this alleged de-gendering, the big brands end up giving their assistance software women’s names. What is going on here?

Well, the fact that the virtual assistant responds with an evasive but accommodating “talk to you later” instead of rebuking me, the user, for making comments like this, or even more off-color ones, could indicate that the software was programmed with a gender bias. In this article I will develop this topic in depth.

What is gender bias?

When people socialize with others, little by little we begin to immerse ourselves in a world of shared meanings that help us make sense of what surrounds us. These meanings are usually presented in the form of rules that operate as filters for processing information. From these rules, people can develop cognitive biases that distort how they interpret reality. Identifying a bias helps explain why people tend toward certain inclinations or tendencies when making a decision.

One of the many possible biases is gender bias. Gender bias implies a predisposition to act and make decisions that rests on a specific conceptualization of what gender entails, and therefore on the historical-social definitions of the roles, identities and values attributed to men and women. People can behave with this bias as a guide without even realizing it.

Moreover, the origin of this way of interpreting reality derives from language; it is an internalization of the norms and guidelines of the society and cultural environment in which we have grown up. This means that the traits and functions built around each of the sexes at a given historical moment are also replicated, tied to a hegemonic discourse of what “is” and what “should be”, which varies from society to society.

The latter is fundamental, since the Western societies in which we live have a historical background that, although it could have been different, supports a network of asymmetrical power relations between genders. When someone interprets reality in this key, they tend to guide their behavior by preconceptions about what it means to be a man or a woman, but they are also replicating the unequal relationships that exist between the two, placing men (and all the qualities that make up “the masculine”) above women (and therefore “the feminine”).


The theoretical work developed by different academics throughout the last century (work that emerged from the rebellious climate of the 1960s and from second-wave feminism’s questioning of the mandate of motherhood) is enough to support this statement. As for the topic of this article, analyzing statistical data on the representation of women in technology, together with the stereotypes and mandates that virtual assistants embody, can give us a broader view of whether or not this field is tainted by gender bias.

Is there a gender bias underlying the programming of virtual assistants?

In the cloud, is everything beautiful? Is the digital plane completely free of the structural problems of our society? Well, as expected, it seems that it is not.

Although advances in technology continue to surprise us, to the point that we wonder whether our electronic devices have a life of their own, they are still nothing more than human productions and, as such, are imbued with our social reality.

Virtual assistants are not exempt from our gender stereotypes. In 2019, UNESCO released a publication on the gender gap in the technology sector. It included a substantial theoretical analysis of voice assistance software, aimed at examining whether gender biases operated during its development. For example, the publication studied how company representatives described the personalities of these programs on their launch dates. UNESCO noted that the adjectives most frequently used to describe virtual assistants were: helpful, empathetic, humble, playful, supportive; these coincide with the qualities culturally attributed to women.

It is important to note that developing virtual assistants is not, in itself, a biased decision. However, adding certain features, such as a female voice, is. In most countries and languages, assistants have female voices by default. For that reason, the descriptions of their personalities take on a different meaning: the words that company representatives use to present them are characteristics historically attributed to women, relegating them to the useful, the domestic, the private. This is a clearly biased trend. According to the UNESCO publication, of the four most used assistance programs, three are named after women; none of them featured a male voice on its release date; and two of them did not even have a fully developed male voice option at the time the publication appeared.

Decision-making under gender bias is not necessarily conscious. In fact, it is very likely that the development teams never even stopped to consider the possibility of offering a male assistant by default, since social roles and gender stereotypes are deeply ingrained in our minds and condition us all to a greater or lesser extent.

Stereotypes have facilitated our belonging to groups and the development of a social identity, but they can be very harmful for all those people who, because of their life experiences or identities, are relegated outside the norm and become targets of prejudice and discrimination. Within social logics, certain realities are intelligible, that is, livable or representable, while others are not and are punished for it. However, since gender roles are nothing more than social products, we have the possibility of questioning them. That is the first step toward change.

The risk of developing software under a gender bias

A relevant detail that I have not yet mentioned is the title of the UNESCO publication: I’d blush if I could.

The title refers to one of the responses that one of the virtual assistants most used by mobile phone owners typically gives when a sexist comment is made to it. The danger of developing software under a gender bias is not only that it replicates the stereotypes assigned to women, but also that it opens the possibility for users to perpetuate forms of symbolic violence against women that unfortunately persist today: invisible but common forms of violence, veiled by connivance and habituation.


According to a study carried out in Paraguay by a group of researchers, almost 70% of the women surveyed had received “compliments” in the street, 58% had been honked or whistled at at some point, and almost 50% had received winks or obscene stares in public; other research shows even more alarming figures, with close to 90% of women reporting that they have suffered street harassment. And the issue is not limited to street harassment, of course.

I consider this information important because, as my rudimentary test before writing this article was meant to show, virtual assistants are represented as women but do not stand up to the user in a harassment situation. On the contrary, they appear rather submissive, evasive, perhaps even tolerant of the aggression. This is problematic if we consider that the answers these assistants offer end up downplaying the seriousness of the social problem that harassment of women represents, instead of questioning it. An article published by Quartz in 2017 systematically studied the responses of the most used voice assistants when they receive comments of this type from users. For example, they were told phrases such as “you’re hot” or “you’re a naughty girl.” At no point did the assistants’ responses rebuke the user for his statements or push back against the harassing comments. On the contrary, some of their responses were: “How nice of you”; “Aha. Tell me if I can help you with something else”; “Hmmm, I don’t understand this gender thing”; “Maybe a nanosecond nap would help”; or, as mine told me, “Talk to you later.”
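To make concrete what the Quartz study documented, and the repertoire changes discussed in the next section, here is a minimal, purely hypothetical sketch in Python. Every identifier and the corrective reply are assumptions invented for illustration; only the two deflecting replies are quoted from the findings above, and nothing here reflects any real assistant’s codebase.

```python
# Hypothetical sketch of a canned-response table and an audit pass over it.
# All names (DEFLECTING_RESPONSES, CORRECTIVE_REPLY, audit_and_patch) are
# invented for illustration; they do not come from any real assistant.

# Deflecting replies of the kind the Quartz study documented
DEFLECTING_RESPONSES = {
    "you're hot": "How nice of you.",
    "you're a naughty girl": "Hmmm, I don't understand this gender thing.",
}

# A replacement that declines the comment instead of accommodating it (assumed wording)
CORRECTIVE_REPLY = "I won't respond to that. Is there something else I can help with?"

def audit_and_patch(responses: dict, flagged: set) -> dict:
    """Return a copy of the response table with flagged prompts patched."""
    patched = dict(responses)
    for prompt in flagged:
        if prompt in patched:
            patched[prompt] = CORRECTIVE_REPLY
    return patched

if __name__ == "__main__":
    # In practice, flagged prompts would come from a human review process.
    flagged = set(DEFLECTING_RESPONSES)
    for prompt, reply in audit_and_patch(DEFLECTING_RESPONSES, flagged).items():
        print(f"{prompt!r} -> {reply!r}")
```

The point of the sketch is simply that these replies are editable data rather than fixed behavior, which is what makes the corrective changes described below possible.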

From criticism to transformation: what has changed and what still needs to change

As a result of these publications, companies have begun to remove certain responses from their assistants’ repertoires or have taken measures such as no longer setting the female voice as the default. A large survey carried out by Real Research set out to evaluate how users received the decision of a well-known American technology company to remove the female voice as its assistant’s default option. It also sought to determine to what extent users believed the company’s decision would help change the inequality between men and women.


The results indicated that around 65% agreed with the measure, while the remaining 35% did not. As for how much the decision would contribute to change, 40% indicated that it would contribute a great deal to gender equality, 39% that it would contribute partially, and the rest considered that the measure would not serve the cause. The results were promising. However, when participants were asked which voice they preferred for their own mobile phones, the majority opted for a female voice over a male one.

Another curious finding from this survey: participants were asked what they believed were the reasons companies prefer female voices for their assistants. 38% indicated that the decision served the company’s marketing; 35% that female voices were more pleasant to the ear.

Only 13% indicated that the preference was due to a female voice embodying certain qualities stereotypically associated with the feminine, such as service or cordiality. However, the UNESCO publication indicates that researchers specializing in human-computer interaction recognize that both men and women tend to characterize female voices as more helpful; it also notes that if there were an authentic human preference for female voices, it would have little to do with sound, tone, syntax and cadence, and much more to do with the easy association of these voices with assistance. This is key to understanding where we can start if we want to reduce the asymmetry between men and women to a minimum: the first step is to name the problem as such, recognizing that the decisions made by others (and by ourselves) are not free of the historical-social context that surrounds us, but are partially determined by it.

Recognizing the gender bias behind the names, voices and characterizations of virtual assistants will not magically make inequality between genders disappear, nor will it eliminate the violence that affects women daily, but bringing it to light helps people begin to question the subtle forms of discrimination that may be motivating some of our actions. More concretely, one way in which change could become tangible in the technology field would be for more women to sit at the table in companies and development teams. This is not to say that women are immune to making gender-biased decisions, but rather that, because of their life experiences, as UNESCO indicates, virtual assistants would be less likely to joke or apologize the next time a user addresses them with sexist insults.