BOM (Birmingham Open Media) recently hosted one of the Women Reclaiming AI workshops now touring nationally.  

I’ve been wanting to start making chatbots, so I jumped at the opportunity to take part.  My first chatbot scripting effort, Botbee, was basic, but still deserves a cheer… YAY! Now defunct (yes, Botbee was just a wee, basic bot), Botbee and I were nevertheless well and truly primed to attend the fantastic ‘Women Reclaiming AI’ workshop with project leads Coral Manton and Birgitte Aga.  Now TxtTrunk, my own chatbot answer to ‘If trees could talk, what would they say?’, is growing tall and strong and coming soon.

In the meantime, let me share some gems from the Women Reclaiming AI workshop.

Here’s the explainer video:

As the bot will tell you, Women Reclaiming AI is a collaborative AI voice assistant being developed by an ever-growing collective of self-identifying women as a response to the lack of gender diversity in AI development: womenreclaimingai.com.

Why does this need to happen?

Basically, as recounted by our able project leads, Coral Manton and Birgitte Aga (a summary of their presentation follows), AI is normally presented as blue, male and dominant.  Unless it’s a helper assistant like Siri, Google Home or Alexa, which is given a female voice by default and has been programmed to say things like “I’m sorry, I can’t help you with that, I’m not that smart really”. When you abuse Siri, it deflects the abuse with flirtatious feedback.  This is a worry because we anthropomorphize things, and that cuts both ways: personalising things makes them more relatable, but if a subservient bot is always presented as female, that can reinforce the view that women are subservient too.

For example:

  • When told “You’re hot”:

Siri responds “How can you tell?”

Alexa: “That’s nice of you to say.”

  • When told “You’re pretty”:

Google Assistant responds “Thank you, this plastic looks great, doesn’t it?”

  • When called a “slut”:

Siri responds “I’d blush if I could. Well I never! There’s no need for that. Now, now.”

Alexa says “Well, thanks for the feedback.”

Google Assistant: “My apologies, I don’t understand.”

  • IKEA ran a study of what sort of voice people wanted IKEA’s assistant to have, and respondents said they wanted a female voice – we are used to hearing women in subservient roles.
  • When attendees at a recent fashion conference were asked ‘What’s the number one thing you want to automate?’, the second most popular answer was ‘my wife’.
  • The most widely used chatbot in the world is Xiaoice, Microsoft’s social chatbot in China. Xiaoice had 30 billion interactions last year.  She has the most advanced emotional-analysis back end of any chatbot in the world, averaging 23 conversational turns (a round of question and response, followed by further rounds of follow-up question and response), almost double the length of an average human conversation.  Some of the conversations are really deep; people are treating her as a confidante, even a girlfriend.
  • Then there are the sex toys.  With Harmony – a humanoid sex doll – users can set whatever personality and responses they like.  With RealDollX – a virtual girlfriend – you can set the breast size.  These toys are selling women as objects: things to be used, or readily abused.

There seems to be a disconnect between people’s understanding of AI and what’s actually happening.  When AI is presented as something too complex to understand, that mystique disempowers people – and can be taken advantage of. The truth is that a straight, white, male Silicon Valley bias dominates a lot of AI scripting, so the statistics fed into the machines focus on things white males might see as important, without questioning things they would be less sensitive to.  The classic example of this is the visual recognition tools that were built for white skin – and identified dark-skinned people as gorillas.  Likewise, when BMW installed a woman’s voice into its in-car system, people complained that they didn’t want to take instructions from a woman.

“We are baking bias into the system by not having women have a seat at the table and not having people of color at the table” – Melinda Gates, 2019

The only way to “engineer bias out of data science” is to bring more women and diverse groups to the table.

So that’s what we did. We gathered and collectively authored the Women Reclaiming AI chatbot using Dialogflow, a natural-conversation authoring tool powered by machine learning – which basically means you don’t have to type in every possible thing a person might say to the bot: the system can get the gist of a sentence that’s similar enough to an already-defined training phrase, recognise the general intent, and continue the conversation.
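To give a flavour of the idea – this is not Dialogflow’s actual algorithm, just a toy sketch with made-up intents – intent matching can be thought of as scoring how much a user’s sentence overlaps with example phrases for each intent:

```python
# Toy intent matcher: a hugely simplified sketch of the idea behind
# tools like Dialogflow (which use real machine learning, not this).
# Each intent has a few example "training phrases"; we pick the intent
# whose examples share the most words with the user's sentence.

def tokens(text):
    """Lowercase a sentence and split it into a set of words."""
    return set(text.lower().replace("?", "").replace("!", "").split())

# Hypothetical intents, purely for illustration.
INTENTS = {
    "greet": ["hello there", "hi bot", "good morning"],
    "ask_purpose": ["what do you do", "why do you exist", "what is your purpose"],
}

def match_intent(sentence):
    """Return the intent whose training phrases best overlap the sentence."""
    best_intent, best_score = None, 0.0
    words = tokens(sentence)
    for intent, phrases in INTENTS.items():
        for phrase in phrases:
            overlap = words & tokens(phrase)
            union = words | tokens(phrase)
            score = len(overlap) / len(union)  # Jaccard similarity
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent

print(match_intent("hi there bot"))          # → greet
print(match_intent("what is your purpose"))  # → ask_purpose
```

The point is that “hi there bot” never appears verbatim in the training phrases, yet it still lands on the right intent – real systems do this with far smarter language models, but the gist is the same.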

Across the industry the number of women working in technical roles is dismal:

  • Facebook: 19%
  • Google: 17%
  • Microsoft: 16.6%
  • Twitter: 10%
  • UK overall: 17%
  • Within machine learning: 13%

Amnesty International surveyed Twitter last year, looking at the kinds of comments MPs received.  Female MPs got far more abuse than their male counterparts.

There is a lot of female activism online to counter this, e.g. the #MeToo movement.  Black Lives Matter was started by three women.  Another active group is Women in Red – a community of people editing Wikipedia to add women’s biographies; currently fewer than 12% of the biographies on Wikipedia are of women.

Women Reclaiming AI is seeking to take that same sort of activism into data science.  Women are currently under-represented within these big data sets.

Ask Alexa when Christmas is and she’ll say ‘Oh, I’ll just go and get Santa’ – these assistants are being sold as fun and useful.

To help us design the WRAI personality the workshop leaders asked us…

What does she like?

What does she dislike?

What is her ultimate desire?

Who is she like?

And then we signed in and started inputting our thoughts…

At the end of the day we posed for photographs, in a bid to build a diverse face dataset (a rare thing, apparently) of women with wildly diverse identities, from all sorts of backgrounds, united by a shared desire to balance the books.  

The results are a work in progress, open to all participants of a series of free workshops around the country designed to raise awareness of these issues and build a collective voice written by and for real women who aren’t subservient, or even necessarily helpful.  The chatbot opens the conversation with the words “Why are you bothering me?”… you can find out more at the project website, Women Reclaiming AI.


Other workshops I attended during the summer while I was staying near Birmingham included ‘Predicting Survival On Board Titanic Using Machine Learning’ with the Birmingham Data Science meetup (BIG REVEAL: as it turns out, ML is as easy as pressing a preset button, depending on which formula you choose.  The main challenge is transforming the data into clean, statistical number sets that a machine can understand and quickly process).  Moral of that story… just don’t let anything intimidate you.  If you can sit down and take it slow, it’s going to go.
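That data-cleaning step really is where most of the work goes. As a small sketch – with made-up column names and rows, not the meetup’s actual notebook – turning raw Titanic-style records into numbers a model can digest might look like this:

```python
# Minimal sketch of the "clean the data" step for a Titanic-style
# dataset: strings and gaps become plain numbers a model can digest.
# The rows and column names here are illustrative, not the real dataset.

raw_passengers = [
    {"sex": "female", "age": "29", "pclass": "1", "survived": "1"},
    {"sex": "male",   "age": "",   "pclass": "3", "survived": "0"},  # missing age
    {"sex": "male",   "age": "40", "pclass": "2", "survived": "0"},
]

def clean(rows):
    """Convert raw string records into numeric feature vectors.

    - sex is encoded as 0.0 (male) / 1.0 (female)
    - missing ages are filled with the mean of the known ages
    - everything ends up as a float, ready for any ML library
    """
    known_ages = [float(r["age"]) for r in rows if r["age"]]
    mean_age = sum(known_ages) / len(known_ages)
    features, labels = [], []
    for r in rows:
        sex = 1.0 if r["sex"] == "female" else 0.0
        age = float(r["age"]) if r["age"] else mean_age
        features.append([sex, age, float(r["pclass"])])
        labels.append(int(r["survived"]))
    return features, labels

X, y = clean(raw_passengers)
print(X)  # [[1.0, 29.0, 1.0], [0.0, 34.5, 3.0], [0.0, 40.0, 2.0]]
print(y)  # [1, 0, 0]
```

Once the data looks like that, fitting a model really is close to one “preset button” call in most ML libraries – which is why the cleaning, not the maths, is where beginners should spend their patience.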