Should I worry about gender-biased Artificial Intelligence?
CGTN

What's the problem?

Artificial Intelligence is man-made, literally. The World Economic Forum's latest Gender Gap Report notes that only 22 percent of AI professionals are female. And while AI may seem futuristic to some, it's already gone from science fiction to societal fact: Adobe says that 88 percent of European companies will use AI for customer analytics by 2020, while Servion predicts that AI will drive 95 percent of customer interactions by 2025.

However, there are doubts over whether the rise in quantity will be accompanied by a rise in quality. Research firm Gartner calculates that by 2022, 85 percent of AI projects will deliver erroneous results due to bias in data, algorithms or the teams that manage and build them. The heart of the problem lies not so much with AI itself, which is agnostic, as with how it is constructed by humans, who have their own in-built biases. AI is humanity's child, but its parents are overwhelmingly male.

What might it mean for our future?

If the infrastructure that powers everything we do is male, it will cement and could even exacerbate gender inequality in the world.

"If we don't build better and more transparent systems, we'll build in bias and we'll build in unfairness and that will disadvantage people," says Gina Neff, a professor at the Oxford Internet Institute at the University of Oxford. "It will look like it's technologically neutral and transparent but it won't be."

Think, for example, about virtual assistants like Alexa, Google Home and Siri. By default, they're usually 'female'. University of Southern California sociology professor Safiya Umoja Noble calls this "a powerful socialization tool that teaches us about the role of women (...) to respond on demand."

What do the experts say?

"An algorithm is an opinion expressed in code," says Ivana Bartoletti, founder of the Women Leading In AI network. "If it's mostly men developing the algorithm, then of course the results will be biased… you're teaching the machine how to make a decision." Neff warns that women are less likely "to be represented in news articles, to be the subject of political news, to be contacted as sources of information, to be on Wikipedia, to be in medical data. And so when you're building technologies that scan for massive sources of information, we're simply leaving out information about women and information created by women."

And even achieving full gender parity in AI hiring may not completely solve the problem.

"We need to do social systems analysis for how data systems and AI will continue to influence people for decades," says Oxford University's Neff. 

"Having a woman in the room is a good start, but it's not enough. We really need to be building systems for all people, across a whole array of divisions in society. If we don't do these kinds of social, structural, systemic, holistic analysis of information and data we're going to end up with biased systems, and those systems are going to fail people."