Dr. Safiya Noble talks algorithmic bias and the future of technology

When internet studies scholar Dr. Safiya Umoja Noble googled the phrase ‘Black girls,’ she was surprised to find that the top suggested sites were pornography.

This is a flagrant example of data discrimination — the idea that algorithms and search engines can amplify social bias, rather than neutralize it. On March 3, Noble came to UBC to speak about algorithmic bias, political polarization and uncivil discourse in the US.

Noble, a professor of Gender Studies and African American Studies at UCLA and Co-Founder and Co-Director of the UCLA Center for Critical Internet Inquiry, was the third speaker in UBC’s Lind Initiative (Un)Civil Discourse series. Hugh Gusterson, a UBC professor of anthropology and public policy, joined her for a discussion.

She began with a keynote address about her 2018 book Algorithms of Oppression, which challenges the idea that search engines are neutral. Rather, Noble argues that automated systems, intentionally or unintentionally, reflect systemic discrimination and reinforce racism and misogyny, particularly against Black women.

“That’s the thread that I started with and started pulling on, and then discovered many more deeply consequential kinds of search results that demonstrated a clear racist and sexist bias against women of colour,” said Noble.

She went on to address how anti-Blackness in American history informs today’s tech landscape.

One example is predictive algorithms, which use historical datasets to predict future outcomes. However, data collection processes from healthcare to policing have historically misrepresented Black people. Using that data creates algorithms that can thoughtlessly carbon-copy the injustice of the past into the present and future.

“[Policing and healthcare are] fundamentally the most dangerous kind of predictive technology, that it really is a matter of life and death for people,” said Noble.

Noble compared the current state of Big Tech to the eras of Big Cotton and Big Tobacco in the 19th century, arguing that we must break the idea that “our economies depend upon these industries, that we could not imagine a place of life without them.”

Ultimately, Noble posed the idea that our current technology cannot coexist with democracy.

“I argue everywhere I go that … the new kinds of ungovernable AI and digital technology that’s coming into the marketplace is probably going to be the most important human rights issue of the 21st century,” she said.

Gusterson and Noble's discussion focused on how the responsibility for engaging safely with harmful technology should not fall on users, but on regulators, politicians and technologists.

They also spoke about internet regulation and the recent debate, raised in the Supreme Court’s hearings of Section 230 cases, over whether technology companies can violate citizens’ civil rights.

The talk concluded with a question about how people can begin to push back against Big Tech when it’s difficult to imagine modern life without it.

Noble argued that the stakes are too high to remain complacent, pointing to the carbon footprint and environmental impact of training natural-language processing systems like ChatGPT.

“Even if you don’t care about the atrocities against people of colour, … the repeal of civil rights for women, all of the things that are happening, maybe you will care about whether your kids have a planet,” said Noble.