Safiya Noble burst into tears upon hearing the news of her MacArthur Fellowship, when she finally answered the phone after a week of assuming the calls from a Chicago number were robocalls.
Noble studies internet bias and how search engines like Google and Yahoo exacerbate racism and bias against women. She's the founder and co-director of the University of California, Los Angeles' new Center for Critical Internet Inquiry.
Over the last decade, many researchers of color, women and LGBTQ researchers have tried to warn about the dangers that lurk online, she says, but it wasn't until the 2016 presidential election that people started catching on. The Cambridge Analytica scandal brought attention to the fact that online platforms can manipulate people's thinking.
“There are many technologies that surveil us, collect data about us to manipulate us or to kind of steer us in particular ways or really to foreclose opportunities from us,” she says. “We think of it as some of the most important public interest civil rights work there is when it comes to the Internet.”
In her work, Noble looks at how search engine algorithms exacerbate racist stereotypes and bias against women. Algorithms are sets of instructions given to computers — but the public doesn’t know how this works on platforms people use every day, she says.
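As a toy illustration of what "a set of instructions" for sorting information on the web can look like, a few lines of Python are enough. This is invented for illustration, not any real search engine's code:

```python
# Invented toy example: an "algorithm" here is just explicit
# instructions telling the computer how to order web pages.

pages = [
    {"url": "a.example", "clicks": 120},
    {"url": "b.example", "clicks": 980},
    {"url": "c.example", "clicks": 45},
]

# One line of instructions decides what users see first:
# rank pages by click count, highest first.
ranked = sorted(pages, key=lambda p: p["clicks"], reverse=True)
print([p["url"] for p in ranked])  # b.example shown first
```

The instructions themselves are trivial. The consequential part is which signal they privilege (clicks, payments, links), and that is a choice made by people, which is where bias can enter.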
Many women experience racial bias and sexism in algorithms when searching terms like "girl" or "girls," which can bring up pages full of pornography, she says.
“There were just a number of racist stereotypes and sexism, and it was baked into the way the algorithm sorted information on the web,” she says. “Ten years ago when I made an argument like that, people would say, ‘Safiya, that is impossible because algorithms are just math and math can’t be racist.’ ”
But Noble is talking about what happens when those formulas are applied to the millions of searches conducted on the internet: they can bring up the worst possible results.
Many people assume that search results are based on what’s most popular — a myth that Noble wanted to unpack in her book “Algorithms of Oppression.”
For years, Noble studied particular phrases — "Black girls," "Latina girls" and "Asian girls" — that bring up pornography without any mention of words like "sex" or "porn."
“Girls of color were synonymous with pornography,” she says. “They’re a small minority of the population relative to the whole, so they have no agency and control as a community, as individuals over being represented pornographically.”
In the book, she says she also explored what it means when industries like pornography can “outspend and out-optimize keywords that belong to people and communities that have already faced a long history of racist and sexist misrepresentation in popular culture.”
People don’t realize that search engines are advertising platforms, not just knowledge spaces, Noble says. She wants people to understand that search engines work well for simple lookups, like what time a coffee shop closes, but not for complex questions about people and history.
“These are nuanced kinds of queries that are actually being put into an advertising engine,” she says, “which means those who have the most money to pay are going to be more visible.”
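To make that dynamic concrete, here is a minimal, purely hypothetical sketch. The `Result` fields, the `bid_weight` parameter and the scores are all invented, and no real search engine ranks this way; it simply shows how blending paid bids into a ranking score lets the deepest pockets outrank the most relevant page:

```python
# Deliberately simplified sketch (not Google's actual ranking system):
# once advertiser money is blended into a ranking score, the highest
# bidder can outrank more relevant organic results.

from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float  # hypothetical 0-1 organic relevance score
    bid: float        # hypothetical advertiser bid in dollars (0 = organic)

def rank(results, bid_weight=0.5):
    """Sort results by a blended score; any nonzero bid_weight means
    money, not just relevance, decides visibility."""
    return sorted(
        results,
        key=lambda r: (1 - bid_weight) * r.relevance + bid_weight * r.bid,
        reverse=True,
    )

results = [
    Result("community-archive.org", relevance=0.9, bid=0.0),
    Result("paid-content.example", relevance=0.3, bid=2.0),
]

for r in rank(results):
    print(r.url)
# paid-content.example prints first despite lower relevance,
# because its bid dominates the blended score.
```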
One obstacle is that people trust large search engines like Google, she says.
Before Dylann Roof shot and killed nine Black churchgoers in Charleston, South Carolina, in 2015, he found white nationalist websites through search engines. In this painful case, Noble says, Roof reported that he ended up on such sites while trying to understand the media coverage of George Zimmerman’s trial for the killing of Trayvon Martin, a key moment that ignited the Black Lives Matter movement.
“When Dylann Roof did those searches trying to understand ‘who was Trayvon Martin?’ and ‘who is George Zimmerman?,’ ” she says, “he did very quickly fall down a rabbit hole of white supremacist and white nationalist websites, anti-Black, anti-Semitic.”
Regulation that prevents search engines from gathering information about people over time and using it to steer their attention toward certain types of content could help, Noble says. She envisions a digital future in which people understand Google as an advertising platform and also have access to different kinds of search engines curated by librarians, teachers or other experts.
“We could differentiate more so that when we have certain kinds of queries, maybe they take longer to be answered instead of an instant answer,” she says, “more like the difference between fast food and organic slow food.”
Karyn Miller-Medzon produced and edited this interview for broadcast with Jill Ryan. Allison Hagan adapted it for the web.
This article was originally published on WBUR.org.
Copyright 2021 NPR. To see more, visit https://www.npr.org.