Nearly a decade has passed since Safiya Noble googled “Black girls” and found the search results were mostly pornographic – a discovery that drove her to explore how algorithms can perpetuate discrimination and inequality.
Google went on to fix that search engine issue, but Noble said the problem is far from solved. Research this year found Google’s advertising platform linked the search phrases “Black girls,” “Latina girls,” and “Asian girls” to adult content ads.
“The Black girl search is the thread on the sweater we are trying to unravel. It’s a way into a larger conversation about the future of knowledge,” said Noble, who will speak on a digital rights panel at the Thomson Reuters Foundation’s annual event, Trust Conference, being held online on Wednesday.
Noble, a leading technology scholar at the University of California, Los Angeles (UCLA), went on to coin the term “algorithms of oppression” and wrote a book of the same name.
“Algorithms of oppression are everywhere,” she said. “I’m talking about computational software that might be embedded in large-scale platforms – or even household brands – that are used to disenfranchise, marginalize, and misrepresent.”
That Google’s search engine had been built to assume those looking for information about Black girls were interested in pornography “reinforced misrepresentations,” Noble said.
Cathy Edwards, vice president of engineering at Google Images, said that because the company’s systems “are organizing information from a constantly-changing web, our results can sometimes show negative stereotypes.”
But she added in emailed comments that Google was “committed to making diversity, equity, and inclusion part of everything we do — from how we build our products to how we build our workforce.”
Beyond search engines, Noble said she was also concerned about algorithms used in a whole suite of technologies – from predictive policing software and facial recognition tools to social media platforms that accelerate disinformation.
These technologies not only tend to single out minorities for unfair treatment, but also offer those affected little recourse.
“(These technologies) are particularly deployed towards vulnerable people, oftentimes communities of color, poor people, and people who are the least empowered to resist,” she said.
Greater diversity in the technology industry would help, she said, but would not erase the harm technology can do to vulnerable groups.
“You can’t see the flaws in your system when you have a fairly homogeneous group of designers, who are designing for a diverse set of societies around the world, and communities,” she said.
“We know that every year Silicon Valley says it’s going to get more diverse. But there’s no improvement, even though for 10 years we’ve been gathering more evidence about the harms of their products.”
Noble applauds renewed public interest in scrutinizing and putting pressure on technology platforms, the so-called “techlash” that set in after the 2016 U.S. election.
Technology scholars, researchers and journalists are making an increasing effort to track how Facebook facilitates the spread of disinformation and contributes to a polarized electorate, she said.
Noble belongs to the “Real Facebook Oversight Board”, a group of academics and activists established to question the social media network’s policies and content moderation decisions in the run-up to this month’s U.S. election.
Facebook did not reply to a request for comment about the Board, but in September a spokesman said it was “mostly longtime critics creating a new channel for existing criticisms”.
Noble also welcomed potential government intervention to reduce the power of tech firms.
“When you have a monopoly advertising company controlling the information landscape you have a recipe for disaster,” she said. “Large tech companies prioritize their own profit at all costs.
“Who pays the price? We know it’s people who are less capitalized and minoritized, that’s just a fact.”