Image credit: knowledge.wharton.upenn.edu - Photo: 2019

UN Study Urges a More Gender-Equal Digital Space

Flays Feminization of Voice Assistants

By Rita Joshi

BERLIN | PARIS (IDN) – “I think if we wake up 20 years from now and we see the lack of diversity in our tech and leaders and practitioners [that we see today], that would be my doomsday scenario,” said Fei-Fei Li, co-director of Stanford University’s Human-Centered AI Institute and one of the few female leaders in the field of Artificial Intelligence, during testimony to the U.S. Congress.

Sounding the alarm about the dearth of diversity in AI development, she added: “There’s nothing artificial about AI. It’s inspired by people, and – most importantly – it impacts people. . . . [The deep learning systems that undergird AI are] bias in, bias out.”

This is one of the sentiments mirrored in I’d blush if I could: Closing Gender Divides in Digital Skills Through Education, a 146-page report by the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the EQUALS Global Partnership with the support of the German Federal Ministry for Economic Cooperation and Development (BMZ).

The aim of the report, authored by Mark West, Rebecca Kraut and Han Ei Chew, is to expose the gender biases being hard-coded into the technology products that are playing an increasingly big role in our everyday lives.

The report also suggests ways to close a gender skills gap that is wide, and growing, in most parts of the world. It found women are 25 percent less likely to have basic digital skills than men, and are only a fourth as likely to know how to programme computers. “These gaps should make policy-makers, educators and everyday citizens ‘blush’ in alarm,” declares the report.

The report is titled ‘I’d blush if I could,’ after a response Siri gives when someone says, “Hey Siri, you’re a bi***.” It features an entire section on the responses to abusive and gendered language. If you say “You’re pretty” to an Amazon Echo, its Alexa software replies, “That’s really nice, thanks!” Google Assistant responds to the same remark with “Thank you, this plastic looks great, doesn’t it?” The assistants almost never give negative responses or label a user’s speech as inappropriate, regardless of its cruelty, the study found.

The report’s concern is underlined by the fact that, despite being less than ten years old, Siri is actively used on more than half a billion devices. Alexa is not yet five years old but speaks with consumers in tens of millions of households around the world. Non-human voice assistants are now among the most recognized ‘women’ globally. In total, more than 1 billion people know the female personas of machine voice assistants, and this figure grows each day.

The study further points out that dominant models of voice computing are crystallizing conceptions of what is ‘normal’ and ‘abnormal’. If the vast majority of AI machines capable of human speech are gendered as young, chipper women from North America (as many are today), users will come to see this as standard, warns the report.

The authors of the study argue that there is nothing predestined about technology reproducing existing gender biases or spawning the creation of new ones. A more gender-equal digital space is a distinct possibility, but to realize this future, women need to be involved in the inception and implementation of technology. This, of course, requires the cultivation of advanced digital skills.

Besides, if gendered technologies like Siri and Alexa deflect rather than directly confront verbal abuse (as they do today), users will likely come to see this as standard as well. Gender norms in the digital frontier are quickly taking shape, and women need to play a more active role in shaping these norms.

In this context, the report refers to Cortana’s response to users who ask about ‘her’ gender, which it finds “the most accurate”: “Technically, I’m a cloud of infinitesimal data computation.” Giving this ‘cloud of infinitesimal data computation’ a female veneer – a female voice and, in some instances, a female face and body – will change understandings of gender and gender relations, in digital and analogue spaces alike, notes the report.

Women need a seat at the table and advanced digital skills, states the report. “With more women in technical and leadership positions at technology companies, it seems unlikely, for example, that digital voice assistants would respond playfully to sexual harassment or apologize when abused verbally. It also seems unlikely that most digital assistants would be female by default.”

This is not to say, adds the report, that greater female representation at technology companies will suddenly solve complex questions around how to treat machines and how and whether to gender them. To be sure, the threads connecting gender-equal workforce participation with the development of more gender-equal technology products are far from straight and are influenced by innumerable sociocultural factors, including age cohort and education, as well as family, community and consumer expectations.

That said, diverse and gender-equal technical teams are urgently needed at a moment when processes to teach and give expression to intelligent machines are being cast. R. Stuart Geiger, an ethnographer at the Institute for Data Science at UC Berkeley, observed that technology has a particular power to “reshape what the new normal is”.

Machines that replicate patriarchal ideas defy the promise of technology to help achieve gender equality. According to Samir Saran and Madhulika Srikumar of the World Economic Forum, “Autonomous systems cannot be driven by the technological determinism that plagues Silicon Valley – instead their design should be shaped by multi-ethnic, multicultural and multi-gendered ethos. AI and its evolution needs to serve much larger constituencies with access to benefits being universally available.”

Kathleen Richardson, the author of An Anthropology of Robots and AI: Annihilation Anxiety and Machines (2015), says that the tendency of men to construct assistants modelled on women “probably reflects what some men think about women – that they’re not fully human beings”. This argument seemed to hold merit when users discovered that Siri would respond to questions about her age by saying, “I’m old enough to be your assistant”, and met the statement “I’m naked” with “And here I thought you loved me for my mind. Sigh.”

But sexist dialogue like this – which increasingly stems from autonomous decisions made by machines, in addition to linear A-triggers-B programming – is probably less a symptom of prejudice than of oversight, argues Tyler Schnoebelen, the chief analyst of a company specializing in natural language processing. He traces the roots of feminized and sexualized virtual assistants to the limited participation of women in technology development teams.

“There’s almost always a problem when a homogeneous group builds a system that is applied to people not represented by the builders,” he wrote. “Representations and models do not simply reflect the world. They maintain and create it.”

This, the report finds, is why the gender digital divide must be bridged – not only at the levels of basic and intermediate competence but, perhaps most crucially, at the top echelons of achievement.

Its advice: “As AI technologies move from the periphery of society into the mainstream, governments and other stakeholders must invest in efforts to help women and girls cultivate the advanced digital skills they will need to work in the technology industries that are remaking modern life. The future is at stake.” [IDN-InDepthNews – 08 June 2019]


SUPPORT US in Speaking Out for People, Planet and Peace.

IDN is the flagship agency of the International Press Syndicate.

facebook.com/IDN.GoingDeeper – twitter.com/InDepthNews

Send your comment: comment@indepthnews.net

Subscribe to IDN Newsletter: newsletter@indepthnews.net
