FEBRUARY 13, 2018
INFRASTRUCTURE IS CREATED by people and therefore embeds and reflects the values of the people who create it. This is a fundamental insight in the study of what information studies scholar Susan Leigh Star has called “boring things”: phone books, medical coding manuals, the Dewey Decimal System. Such systems are also usually invisible as long as they work. We rarely think about sewer pipes unless they’re backing up into our houses, and the patterns of air traffic don’t matter unless they’re disrupted by weather.
Of course, what is felt as a disruption changes depending on the social and political position occupied by a given person. Infrastructure does not work equally well for each of us. Think of Robert Moses and his transformation of New York City through his network of expressways. The roadways work quite well for elites moving in, through, and past the city in their cars. But the poor and working-class people of color cut off and isolated from the rest of the city by a swoop of the Brooklyn–Queens Expressway can tell you that Moses’s roads are broken. Infrastructure is a human thing and thus a political thing. Read critically, such systems reveal the ways that power and privilege are normalized such that they extend and consolidate patriarchy, white supremacy, and wealth inequality. More often, such infrastructures are left unexamined. They facilitate normal life, and the inequities that are sustained by them are not seen at all. It’s hard to get from Red Hook, Brooklyn, to Manhattan. That’s just how it is, how it has always been, and there’s nothing political about it. It’s just a matter of the road. Such systems are insidious because they are substrate, by definition sitting underneath the world as we experience it.
Safiya Umoja Noble addresses internet search as one such critical infrastructure in her book, Algorithms of Oppression. Her target is the internet, that structuring machine of everyday life. From its early days connecting Department of Defense computers to each other, the internet has morphed for many of us into an extension of our minds and selves. We carry smartphones with the computing power of a desktop in our hands, along with the promise of a flattening of the social order. The internet is a tool of democracy, after all. Gone are the gatekeepers that allow only sanctioned voices into the public dialogue. In their place, we’re promised, untold Arab Springs will bloom.
Noble contests this fantasy of the internet as equalizing device. Rather than focus on what it facilitates, she explores the internet as infrastructure, investigating what is hidden from view by mathematical algorithms. Noble argues that the web is instead a machine of oppression, a set of “digital decisions” that “reinforce oppressive social relationships and enact new modes of racial profiling.” The internet is not a magic box spitting out facts about Donald Trump, ex-girlfriends, and the history of Algonquian fishing weirs. Code is power, and it is white and male.
The way that power is wielded online is acutely familiar, even as digital tools hold the promise of the new. Noble’s project emerges from a very ordinary moment online. In the fall of 2010, Noble sat at her computer, looking for “things on the Internet that might be interesting to my stepdaughter and nieces.” When she Googled “black girls,” she found instead HotBlackPussy.com. Google’s retrieval mechanism is not interested in what might be good or true or necessary for an audience of actual black girls, curious about themselves and their world. Black girls matter only for the role they play in the racist and misogynist fantasies of Google’s majority client: the white American man. Not so coincidentally, suggests Noble, this is the demographic Google is most likely to hire to build its algorithm in the first place.
Noble’s central insight — that nothing about internet search and retrieval is politically neutral — is made again and again through the accumulation of alarming and disturbing examples. Image searches for “gorilla” turn up photos of African-American people. Looking for “black teenagers” returns police mug shots. Searching “professional hairstyles” returns images of white women wearing ponytails and French braids while “unprofessional hairstyles” features black women. The story told about black people online is almost entirely refracted through a white racist lens. What she surfaces online parallels extended histories of racist white representations of blackness, and black femininity in particular. The pornification of black women on the web bears echoes of Sarah Baartman’s exploitation in the 19th century, argues Noble. For all its innovation and disruption, Silicon Valley simply repeats very old racist stories.
It would be enough if search simply made the internet inhospitable to African-American women, but Noble makes a compelling case that pervasive racism online inflames racist violence IRL. In a chapter on the Dylann Roof shootings in Charleston, South Carolina, Noble describes the beginning of Roof’s radicalization: he “allegedly typed ‘black on White crime’ in a Google search to make sense of the news reporting on Trayvon Martin, a young African American teenager who was killed and whose killer, George Zimmerman, was acquitted of murder.” What Google retrieved for Roof, a vast trove of white supremacist fantasy about black-on-white crime, was instrumental in his decision to enter a church and murder nine people. There is nothing benign about encoding white supremacy in Google’s search algorithm.
Having turned the “boring thing” of algorithmic code into a site of political and cultural analysis, Noble turns to potential prescriptions. How might Google respond to the concerns raised by Noble and others about the racism and misogyny embedded in its networks? Google has acted in several cases. As she notes, the algorithm has been changed to remove pornography from the first set of results when users Google “black girls.” Google also removes anti-Semitic content from its search results in response to hate speech laws in Germany, and complies with Right to Be Forgotten laws in Europe more broadly. The internet as it is retrieved by Google could be different. As a former urban marketing executive whose job was in part to insulate companies against potential racist missteps, Noble is attuned to the ways that Google responds to accusations of racism in its algorithm. And yet, racist content persists.
Noble offers two solutions. The first is a call for Google and other Silicon Valley companies whose code invisibly structures so much of contemporary life to hire people who understand how race and gender and other categories of social difference function in the world to produce different life experiences for different people. If code is invariably created by only one kind of person — usually white, usually male, with a worldview so thoroughly aligned with the forces of dominant power that he can’t see that he has any power at all — the code will always fail to account for minority and minoritizing perspectives. The problem is not just that racist search results are retrieved by Google, but that the people who make Google don’t anticipate that such results will appear at all, and therefore don’t account for them in advance. Indeed, the numbers at Google are stark: in 2016, only two percent of its workforce was African-American and only three percent Latino. For Noble, the dominant whiteness at the heart of technology companies leads to a host of other algorithmically driven problems, from racist Twitter trolls and Snapchat filters to sharing platforms like Uber and Airbnb that facilitate discrimination based on race. Coders need critical race theorists, suggests Noble, or at least workers who understand that frictionless digital infrastructures aren’t frictionless for everyone. Diversifying the technological workforce is an important first step toward building an internet that accounts for power and privilege at the level of code itself.
Her second solution is an appeal to the state in service to the public good. It is refreshing to see a call to state power amid the frenzy of deregulation that has accompanied the Trump administration. Rollbacks of administrative rules have proceeded with speed as Trump largely makes good on his campaign promise to be the most deregulating president in history. By December 2017, his administration had withdrawn or delayed nearly 1,600 individual regulations. Many of these represent attacks on clear public goods like clean air and clean water, and, some argue, attacks on the internet. Among the most public of these fights has been the one around net neutrality. Advocates of net neutrality argue that government regulation of commercial internet companies is necessary in order to ensure that these entities don’t prioritize access to some content over others. The internet is a public utility like the telegraph and telephone that came before it, and should be regulated the same way. Under Trump, FCC chairman Ajit Pai has disagreed, advocating for an end to regulation in order to “restore Internet Freedom.” The question, of course, is: freedom for whom? As Noble makes clear, the internet is only “free and unfettered” if racism or sexism are not problems for you. Government regulation evens playing fields and makes things fair.
Noble makes a convincing case that despite its status as a publicly traded corporation, Google functions much like a public utility. Its use is ubiquitous. Google has become synonymous with “looking things up on the internet,” and its suite of services has locked in many of us, willing as we are to trade a little personal privacy and marketable data for free email and a standard battery of web-based software essential to contemporary professional life. In the course of writing this short review I have checked my Gmail dozens of times, added appointments to my Google Calendar, shared drafts with colleagues via Google Docs, and Googled myriad things. For many of us, Google is a wraparound company. Noble argues that a company that plays this kind of role in our lives must be subject to the checks of government regulation. Without that, we have no recourse except to hope that Google abides by its founding motto: Don’t be evil. Given Noble’s research, we have plenty of evidence that the company cannot.
What emerges from these pages is the sense that Google’s algorithms of oppression comprise just one of the hidden infrastructures that govern our daily lives, and that the others are likely just as hard-coded with white supremacy and misogyny as the one that Noble explores. How might we understand other boring things — our subway systems, tax codes, mortgage rules — as sedimentations of power and privilege, and what must we do to change them? If, as Noble suggests, regulation by an agent of the public good is a necessary counterweight to corporate exploitation of legacies of racism, the question becomes how we can make the state that agent. White supremacy and patriarchy are foundational to the government too, after all. Organizing toward that end will surely involve building our own infrastructures, and that is the code that must be written next.
Emily Drabinski is associate professor and coordinator of library instruction at Long Island University, Brooklyn. She is editor of Gender and Sexuality in Information Studies, a book series from Library Juice Press/Litwin Books.