Algorithm Activist

Safiya Noble is alerting the world to implicit bias in the inner workings of search engine optimization

Safiya Noble (Image by Stella Kalinina)


Most people who Google search for something specific: song lyrics, say, or a reliable neighborhood repair shop.

But Safiya Noble, MS ’09 IS, PHD ’12 IS, seeks to understand what isn’t shown—namely, results helpful to women of color and other marginalized populations. Her research highlights how content that actively harms them or is indifferent to their experience rises to the top of the rankings instead.

When Noble first became aware of digital injustice in graduate school, computer code was regarded as scientifically benign, certainly not something that could be prejudiced. She learned otherwise. Her dissertation became a best-selling book, Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press, 2018), and her voice joined a rising chorus of scholars who pointed out how new technologies can purposefully perpetuate past oppressions.

“Bias can, in fact, be embedded all the way down to the level of code. Algorithms can be harmful in society and have huge effects,” says Noble, now an associate professor of information studies and African American studies at the University of California, Los Angeles, and co-director—with Sarah Roberts, PHD ’14 IS—of the UCLA Center for Critical Internet Inquiry.

Noble was born and raised in Fresno, Calif., an agricultural hub. From an early age, she wondered why some neighborhoods lacked grocery stores and why undocumented farm workers were poorly treated. “Exploitation was part of the fabric of life,” she says.

During her undergraduate years at California State University, Fresno, Noble investigated these injustices, earning her degree in sociology and ethnic studies. By graduation, she understood how power inequities are endemic to the nation’s economic system, so she went to work to champion corporate social responsibility.

Her efforts moved eastward after she met her husband—a Rantoul, Ill., native—at an advertising agency in Oakland, Calif. The enterprising couple launched a company, ultimately moving it and themselves to Champaign to be closer to his parents. When the 2008 economic crash crushed their business, Noble decided to return to academia for a closer look at what was “under the hood” of search engines. She applied and was admitted to what was then U of I’s Graduate School of Library and Information Science.

Chief among Noble’s inquiries: How did Google influence online representation? When she discussed that topic with classmate André Brock, PHD ’07 IS, now an associate professor at Georgia Tech, he made an offhand comment about results for the term “Black girls.” Noble tried the search herself and was shocked to primarily find pornography. “Women get coded as girls, and girls get coded as sexual objects,” she says. “When half the population is female, that doesn’t make sense to me. How is this happening?”

With the help of a programmer friend, she resolved to deconstruct the discrimination. At first, Noble faced resistance, even hostility. “People were saying, ‘You’re using the wrong keywords,’” she says of terms such as Black girls. “But these are the words women use. You’re telling me our words mean pornography or similar things? I don’t think so.”

Nearly a decade, a book and a TED Talk later, her ideas have more widespread acceptance, and a Google search on innocuous words no longer produces results that are decidedly NSFW (not safe for work). The Center for Critical Internet Inquiry just received a transformational $2.9 million grant from Australia’s Minderoo Foundation.

The support is timely. As algorithms play a larger role in everything from mortgage approval to political opinion to criminal recidivism, the stakes have never been higher.

Eliminating implicit internet bias may mean restricting some technologies, reimagining others, even requiring companies to pay reparations for past wrongdoing, Noble says. These solutions may sound drastic. But she sees a precedent in the financial settlements previously reached with tobacco companies—and hope in her belief that, based on her knowledge of how tech manipulates engagement, Americans aren’t as divided or hate-filled as social media platforms might make it appear. “I don’t believe Americans are YouTube comments,” she says. “When given better choices, people choose better.”


The University of Illinois Alumni Association presented its annual Alumni Awards as part of “virtual” Homecoming Week 2020, Nov. 29–Dec. 5. The UIAA honored the recipients of two Alumni Achievement Awards and the recipient of the Lou Liay Spirit Award, and recognized two Honorary Alumni. In addition, two new awards were presented: one recognizing an outstanding young alumnus and the other championing noteworthy efforts to promote diversity and inclusion.
