HTNM Revisited: Safiya Noble

15 Apr, 2019

Recap by Tara Shi, 2018–2019 Graduate Liaison for the History and Theory of New Media Lecture series.

Wired declared 2017 "the year we fell out of love with algorithms." While algorithms and their social impacts are central to international discourse today, Safiya Noble described the beginning of her scholarship in the late 2000s as a climate in which this technology was largely obscure to the public and defended by many academics as 'code that can't discriminate.'

In her lecture, Noble shared a few examples from her recently published book Algorithms of Oppression: How Search Engines Reinforce Racism (2018), preferring a shortened talk to leave plenty of room for Q&A and conversation with Abigail De Kosnik at the end. Throughout, Noble emphasized three theoretical frameworks central to her work: that search engines are linked to racist and sexist histories in the US; that Black feminism offers a unique lens for understanding what search results mean; and a call to stay with intersectionality.

Starting with viral examples from Twitter, Noble examined DeRay Mckesson's discovery that Google Maps labeled the White House the "n----- house" while Obama was in office, and Kabir Ali's and @bobsburgerguy's polarized Google Image search results for black and white teenagers. Importantly, she pulled apart the reactions of this tech giant, framing its apologies as 'non-apologies' that in fact communicate a harmful ideology: these cases are "a glitch in the perfect operating system. It's a temporary problem that can be fixed."

Noble moved on to connect new media with old media, drawing powerful lines between the hypersexualized representation of women of color today and historically in the US. Looking specifically at black women and girls, the pornography that saturates and dominates Google search results for 'black girls' closely parallels long-standing American tropes of the 'Jezebel' and the 'Venus Hottentot' that emerged with the transatlantic slave trade and flourished throughout the Jim Crow South. For Noble, calling this a "'glitch' in the system is insufficient" in the face of this history. "30 years ago, (we) thought it was a site of liberation," said Noble. "There is no mountain of evidence except for algorithms of oppression."

With her next examples, Noble outlined the fallout that algorithms have on the connected public, positing that search engines are "a direct threat to democracy" and are implicated as media in "suicides, genocides, mass murder, mass violence with profound impact on society." For instance, Epstein and Robertson's 2013 study showed that voter preferences could shift substantially based on front-page search results. Noble pointed to the flourishing of misinformation, the gamification of search engines, and their facilitation of the most heinous acts. She posited that for the younger generation in particular, these advertising engines are a way to make sense of the world, often replacing teachers, libraries, and vetted sources of knowledge. Looking at Dylann Roof's manifesto as a case study, his so-called 'racial awakening' via anti-black red-herring search terms led directly to his mass shooting at Charleston's Emanuel African Methodist Episcopal Church in 2015.

For Noble, the United States has had less of a reckoning with, and acknowledgment of, the harmful outcomes of hate content and propaganda online than countries like France or Germany. At large, she asserted that "social inequity will not be solved by an app," pushing instead for regulation of algorithmic technologies and for calling back some projects to take a closer look at what they mean.

In the second part, Noble responded to written questions from the audience in a conversation facilitated by Abigail De Kosnik. One of the main topics was: as students, teachers, journalists, computer scientists, policy-makers, and civilians, what do we do? To this end, Noble resisted "prescriptive solutions," instead encouraging the audience to "seek complex conversations and deep engagement." There are many "technologies around an instant answer that can be quickly implemented," she said, "but pull back. All the world's knowledge cannot be known in 0.3 seconds." She outlined the need for multiplicity to serve as checks and balances.

Cyber-racist and hate speech "gets more speech," as sensational and violent content is more valuable to advertising algorithms and heavily amplified; Noble argued to "counter hate speech with more speech." For her, at the policy level, it is not just about regulating big tech but also about balancing the commercial sector with democratic counterweights. While shifting public resources into big tech is becoming normalized in the name of 'efficiency' and 'ease,' investment in public education, libraries, and media like NPR and PBS offers ways of "surfacing different conversations" that answer to "different priorities and appetites."

Another question, which Noble said was her most frequent, asked about Google Search and its basis in user interaction and engagement. She challenged the mythology that search engines are democratic and "only reflect the interest of the people." Turning back to the porn industry's dominance in search results for terms associated with women of color, she questioned how money moves through these media and their intersecting interests. After all, "it is an advertising platform!" Noble urged. "It's great when you want to buy skirts, but not good for understanding race relations in the US." While search algorithms often excel at answering the banal (how many tablespoons in a cup?), they shift our expectations and reinforce our trust in the system, obscuring its limits for knowledge building and sharing in our complex realities.