The Future Place
Something for us to Create
This is an abridged version of The Future Place talk given at AWS She Builds Day 2020.
Gender Diversity in Tech is Abysmal
In Australia, the gender diversity in technology is pretty appalling.
In year 12 computing classes, 19% of students are female and 81% are male.
In 1st year university Computer Science degrees, the stats are the same. 19% female, 81% male.
In the tech workforce, 16% are female.
Graduate salaries for women in technology are 14.8% less than for equally qualified men.
In the STEM sector (STEM includes technology, along with science and engineering), the quit rate for women is 41%, more than double the 17% for men.
When you look at the quit rate of women for technology, it's even higher - 56%.
There are many proposed reasons for women leaving technology at such an alarming rate; the most commonly reported are a lack of career progression and a lack of flexible work practices. However, the reason I find most jarring is perceived incompetence arising from conscious or unconscious bias.
A study by GitHub on their users found that code written by female coders was accepted 78% of the time, a rate about 4 percentage points higher than the acceptance rate for male coders.
However, this is only true if the gender of the coder was unknown.
A more recent study, reported in The Conversation, found no statistically significant difference between men’s and women’s coding abilities.
However, women perceived themselves to be less competent than men.
There seem to be some really unhelpful biases at play.
So, at the moment, technology starts with fewer women,
Pays them less,
Assumes a lower level of competency
And then has half of them leave.
From both a societal and industry outcomes perspective, this seems sub-optimal to me.
Why do we care?
The lack of gender diversity in tech is bad for women, it's bad for men, and it’s bad for business.
A 2017 study by the Boston Consulting Group (BCG) found that diversity increases revenue for companies.
The biggest takeaway they found was a strong and statistically significant correlation between the diversity of management teams and overall innovation.
Companies that reported above-average diversity on their management teams also reported innovation revenue that was 19 percentage points higher than that of companies with below-average leadership diversity—45% of total revenue versus just 26%.
Aside from innovation being improved and profitability increasing when there are diverse people and views involved, diversity can reduce a range of risks including stopping us from building dangerous tech.
Let’s think for a moment about Andrew Pole and Target’s pregnancy prediction score in the US.
A marketing team at Target approached one of their data scientists and asked, “If we wanted to figure out if a customer was pregnant, even if she didn’t want us to know, could you do that?”
Mr Pole proceeded to synthesise customers’ purchasing histories with the timeline of those purchases to give each customer a so-called pregnancy prediction score.
Evidently, pregnancy is the second major life event (after leaving home for university) that determines whether a casual shopper will become a customer for life.
Target turned around and put Pole’s pregnancy detection model in an automated system that sent discount coupons to possibly pregnant customers.
A Win-Win, or so Target thought.
Right up until a teenager’s father furiously approached Target to ask why they were sending his teenage daughter coupons designed for pregnant women. Were they trying to entice his daughter to get pregnant?
Well, it turns out, his daughter was actually already pregnant.
By analyzing the purchase dates of approximately 25 common products, the model found a set of purchase patterns that were highly correlated with pregnancy status and expected due-date.
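Target’s actual model is proprietary, but the general idea — weighting products by how strongly their purchase correlates with an outcome, then summing those weights into a score — can be illustrated with a toy sketch. Everything here (the product names, the weights, the scaling) is invented for illustration:

```python
# Purely illustrative: Target's real model is proprietary, and these
# product weights are invented, not real correlations.
WEIGHTS = {
    "unscented lotion": 0.8,
    "calcium supplements": 0.6,
    "large tote bag": 0.3,
    "cocoa-butter lotion": 0.7,
}

def pregnancy_score(purchases):
    """Toy score: sum the weights of matching purchases, scaled to 0-10."""
    raw = sum(WEIGHTS.get(item, 0.0) for item in purchases)
    return min(round(raw * 10 / sum(WEIGHTS.values()), 1), 10.0)

print(pregnancy_score(["unscented lotion", "calcium supplements"]))  # 5.8
```

Even a crude score like this, fed with enough purchase history, can surface something deeply private about a person — which is exactly the danger the Target story illustrates.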
Target quickly found itself in a Lose-Lose situation, where it had lost its customers’ trust and was entrenched in a brand destructive PR disaster.
But, the teenager lost far more.
She lost control over private and personal information related to her own body and her own health.
…More Women Were Involved?
Imagine for a moment, that women had been involved in building a maps application for your phone.
What would that look like?
What could a really great version of a maps application actually include?
· You could choose a path that had excellent lighting for walking at night,
· You could choose a path that was suitable for a pram,
· You could be taken to a door at your designated address which you can actually get through with a pram.
The beauty of considering these things is that they don’t just benefit a small sub-group of customers; they benefit everyone. Being able to choose a well-lit route to walk at night is a great feature for anyone.
Whilst gender diversity in technology is abysmal, and is only improving at a glacial rate, let me tell you, as a Director for Women Who Code Melbourne, this lack of gender diversity isn’t even the biggest problem in tech.
Where are the people from:
· Different ethnic groups,
· Different socio-economic backgrounds,
· Different accessibility experiences?
…how bad is this lack of diversity?
We don’t even know!
Because we don’t track it or measure it in a meaningful way in Australia.
America has a few statistics available, and they make for very depressing reading. I cannot say for certain that we are the same, but my gut feeling is that if we are not even measuring these things, assuming we are similar is not an unreasonable starting point.
Why is Diversity Important?
Why do we need mixed backgrounds and life experiences?
Well, because so far, we’ve created some complete disasters:
Joy Buolamwini, a Ghanaian-American graduate student at MIT, was working on a class project using facial-analysis software.
But she came across a major problem! The software couldn’t “see” Buolamwini’s dark-skinned face (by “seeing”, I mean it couldn’t detect a face in the image). She tried many workarounds, such as wearing glasses and changing her hair. But the only thing that worked was putting a white mask over her face.
Why is this?
Apparently she deviates from the average face too much.
The data set on which many of the facial-recognition algorithms are tested contains 78% male faces and 84% white faces.
Darker-skinned women were up to 44 times more likely to be misclassified than lighter-skinned males. It’s no wonder that the software failed to detect Buolamwini’s face: both the training data and the benchmarking data relegated women of colour to a tiny fraction of the overall dataset.
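One practical guard against this kind of skew is to measure group representation in a dataset before training or benchmarking against it. A minimal sketch, with invented records and labels:

```python
# Audit demographic representation in a (hypothetical) face dataset.
from collections import Counter

# Each record: (image_id, skin_tone, gender) -- invented example data.
dataset = [
    ("img1", "lighter", "male"),
    ("img2", "lighter", "male"),
    ("img3", "lighter", "female"),
    ("img4", "darker", "male"),
    ("img5", "darker", "female"),
]

# Count each (skin_tone, gender) group and print its share of the data.
counts = Counter((tone, gender) for _, tone, gender in dataset)
for (tone, gender), n in sorted(counts.items()):
    print(f"{tone} {gender}: {n / len(dataset):.0%}")
```

Running a check like this on benchmark data makes a 78% male / 84% white skew visible long before any model is evaluated against it.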
And what about the Compas Recidivism Risk Algorithm in America?
COMPAS – or Correctional Offender Management Profiling for Alternative Sanctions – suggests it can predict a defendant’s risk of committing another crime. It works through a proprietary algorithm that considers answers to a 137-item questionnaire and assigns each defendant a score from 1 to 10, with 10 being most likely to reoffend.
The news organisation ProPublica looked into this “recidivism risk” algorithm as used by judges in Florida during sentencing, and found some unsettling outcomes.
There are entire books on these and many other disasters, there is a reading list at the end.
It seems we have allowed some dangerous technologies to be built by a very homogenous group.
Can we just blame tech-bros?
Firstly, what is a tech-bro? A quick online search suggests that it can be defined as:
A subculture of mostly male, mostly white, American technological entrepreneurs and workers in and around Silicon Valley, exemplifying a hyper-technocratic, libertarian, meritocratic boys’ club.
Something between alpha-male tendencies, nouveau riche elitism and the privileged arrogance of the young.
I don’t believe that all men in tech are tech-bros. I do, however, think that the tech-bro sub-culture does exist, and it perpetuates a certain deficiency.
Catherine D’Ignazio and Lauren F. Klein have proposed that this deficiency stems from the privilege hazard. Privilege isn’t just reserved for tech-bros; many of us have it in one shape or form.
Privilege hazard comes from the ignorance of being in the dominant groups.
When people are from dominant groups, those perspectives come to exert an oversized influence on decisions being made – to the exclusion of other identities and perspectives.
A downstream effect of privilege hazard is that our products are at best poor, and at worst dangerous.
How do we overcome privilege hazard?
Well, it’s not that hard to solve: we bring more actors into the play.
…We Brought Everyone In?
What if we design a process, a new process, in which many different actors can participate?
What if we changed our mindset to truly value the opinions and experiences from many different perspectives, not just the few who learned to write code well?
Our tech products could be built with feedback and collaboration from many different experiences.
We need to actively and deliberately invite other perspectives into the tech development process.
We could include knowledge from people with:
· Technical expertise,
· Lived experience,
· Domain expertise,
· Community history.
Let’s go back to our mapping app; what other features would be asked for if we talked to more people?
· Pram and wheelchair routes,
· Routes that combine a bike path with public transport you can take your bike on,
· The option to select a sightseeing route,
· The elevation profile of a route,
· Help to find toilets (clean, accessible toilets, or those with baby change facilities),
· The option to choose a well-lit route,
· Nearby rubbish and/or recycling bins.
…We Changed Our Processes?
We could hold our teams and organisations to account by requiring products and features to produce documentation specifying these things:
Who was on the team?
What were the points of tension?
What caused the disagreements?
Which hypotheses were pursued, but were ultimately proved false?
Did the tech team talk to end users, domain experts and communities?
We could make this an enforced part of our processes. Not as a punishment, but as a ‘checklist’, just like a doctor going into surgery would use.
This is important, because – just like those in the medical profession – the actions we take and the choices we make go on to impact peoples’ lives.
…We Changed Our Tooling?
There are many tools on the market that enable feedback, contemplation and collaboration.
Ethics Litmus Tests are a great way to start this journey. They are a set of cards with provocations to generate discussion when you’re designing a product or feature.
You start by describing the problem scenario, or motivating concern. Usually these are sourced from current or recent work, or they can be recurring niggles.
They can be very broad, for example:
“I’m not sure we have thought through the consequences”
Or they could be quite specific, such as
“What if our automated decision algorithm is ageist?”
Then you work through the card provocations to discuss it.
The Preview Links here at Linc also help. They allow organisations to easily share product and feature development as it is happening.
Preview links provide you with a shareable URL for every commit against every backend.
Think that through for a moment. What could you do?
Who could you share with?
Who could you get feedback from?
How great could your products become?
Tech can be full of roadblocks that stop people from being able to become actively involved in the product development process. As a community, we need to look at ways to reduce these and be able to share the products we’re building as we build them.
Just because someone doesn’t know how to run your code on their own computer while working from home doesn’t mean they don’t have valid knowledge and feedback to provide. Feedback that could help build a better product and reduce the amount of developer re-work required.
With Linc’s Preview Links you can share your products and features with the design team, management, your CEO and your customers, and not just for UAT, but along the journey.
So, when you finish reading this, think about the tech you create.
1. Who are the people you’re involving?
2. What are your processes like?
3. Could your tooling be better?
The future isn’t a place we get to go
It’s a place we create
Reading List
Weapons of Math Destruction by Cathy O’Neil
Technically Wrong by Sarah Wachter-Boettcher
Race After Technology by Ruha Benjamin
Automating Inequality by Virginia Eubanks
Algorithms of Oppression by Safiya Umoja Noble