First UNESCO recommendations to combat gender bias in applications using artificial intelligence

Beginning as early as next year, many people are expected to have more conversations with digital voice assistants than with their spouse.

 

Presently, the vast majority of these assistants—from Amazon’s Alexa to Microsoft’s Cortana—are projected as female, in name, sound of voice and ‘personality’.

 

‘I’d blush if I could’, a new UNESCO publication produced in collaboration with Germany and the EQUALS Skills Coalition, holds a critical lens to this growing and global practice, explaining how it:

  1. reflects, reinforces and spreads gender bias;
  2. models acceptance of sexual harassment and verbal abuse;
  3. sends messages about how women and girls should respond to requests and express themselves;
  4. makes women the ‘face’ of glitches and errors that result from the limitations of hardware and software designed predominately by men; and
  5. forces a synthetic ‘female’ voice and personality to defer questions and commands to higher (and often male) authorities.

 

The title of the publication borrows its name from the response Siri, Apple’s female-gendered voice assistant used by nearly half a billion people, would give when a human user told ‘her’, “Hey Siri, you’re a bi***.”

 

Siri’s submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.

 

According to Saniye Gülser Corat, UNESCO’s Director for Gender Equality, “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

 

The publication shares the first United Nations agency recommendations regarding the gendering of AI technologies, imploring companies and governments to:

  1. end the practice of making digital assistants female by default;
  2. explore the feasibility of developing a neutral machine gender for voice assistants that is neither male nor female;
  3. programme digital assistants to discourage gender-based insults and abusive language;
  4. encourage interoperability so that users can change digital assistants, as desired; and
  5. require that operators of AI-powered voice assistants announce the technology as non-human at the outset of interactions with human users.

 
UNESCO uses the example of digital voice assistants to demonstrate that in a world awash in AI technology, the teams building that technology must be more gender-balanced. Today women make up only 12 per cent of AI researchers, represent only 6 per cent of software developers, and are 13 times less likely than men to file an ICT (information and communication technology) patent. Addressing gender inequalities in AI must begin with more gender-equal digital skills education and training. A dedicated section of the publication explains how to make this a reality, providing 15 actionable recommendations.

 

Finally, the report shares a new and paradoxical finding: Countries that score higher on gender equality indices, such as those in Europe, have the fewest women pursuing the advanced skills needed for careers in the technology sector. Conversely, countries with lower levels of gender equality, such as those in the Arab region, have the largest percentage of women pursuing advanced technology degrees. As an illustration, in Belgium only 6% of ICT graduates are women, while in the United Arab Emirates this figure is 58%. This paradox is explored in detail and underscores the need for measures to encourage women’s inclusion in digital skills education in all countries.

 

URL:

https://en.unesco.org/news/first-unesco-recommendations-combat-gender-bias-applications-using-artificial-intelligence