Synthesis Report
- Bert Adams
- Apr 27, 2023
- 4 min read
“Any object created by human hands is actually a physical manifestation of generations of conscientious human experience within a Native homeland.” –Marisa Duarte (Network Sovereignty, 27)
In Algorithms of Oppression, Safiya Noble illuminates the continuity of blame rhetoric, showing that the American government has always blamed victims for their woes. According to Duarte, when Indigenous peoples went hungry, the government called them lazy, when in actuality they had been barred from their fishing spots and other sites of food gathering and procurement. For Noble, when African Americans complained of racist search results online, they were told the results merely reflected society’s own search habits, a deflection that negated the years of coding and algorithmic decisions that had actually produced the inflammatory results.
Duarte places great emphasis on the responsibility of professionals to build products with minority clients in mind. She argues that instead of asking what factors limit access, we ought to innovate around and for those limitations (Network Sovereignty, 89). In the case of tribal digital accessibility, this comes down to engineers, hardware production, and ICT leadership. But for systemic racial inequities encoded into generation upon generation of mindset and technology alike, inequities that have themselves become manifestations of “generations of conscientious human experience,” sites and applications built specifically with people of color in mind feel like more of the same: ‘separate, but equal.’ That is just glorified segregation, gilded with promises, false hope, and gold stars.
In Race After Technology, Ruha Benjamin expands on those digital inequities. She shows that the algorithmic oppression Noble illustrated is encoded in far more than closed-door meetings: it is the continuation of a cyclical system ingrained in our very language and culture. The term ‘robot’ derives from the Slavic ‘robota’, meaning ‘forced labor’. The algorithms that shape our daily digital experience are modeled, in both practice and aspiration, on concepts of oppression; even our science fiction, a realm that ought to harbor our loftiest technological aspirations, is rooted in such problematic terminology (Race After Technology, 55).
Benjamin goes on to demonstrate the extent to which historical processes make the present possible, echoing Duarte’s sentiments, by illustrating the continuity of historic surveillance and policing practices. From slave patrols to militias to precincts, we are living amidst the “physical manifestation of generations of conscientious human experience within a Native homeland.” Just as Indigenous wisdom is rooted in part in the natural phenomena of a people’s locale, our society’s bigotry is rooted in our history, repeated in every possible iteration, and amplified by the very technologies we claim herald our cultural progress.
Where Noble insinuates that representation may be enough to turn the tables, Benjamin insists that these problematic algorithms and results are far from glitches. She explains that even with a seat at that table, one may not have the authority or acumen to rectify the situation; that what the corporate world is wont to write off as a blip has deep-rooted tendrils in larger, systemic, societal issues; and that these ‘glitches’ will continue to appear until we address those issues as a culture.
One of the ways we might address this is to accept, once and for all, that innovation is not always progress; what seems helpful and enriching to some may be predatory and oppressive to others. Duarte explains that our institutions are crystallizations of our communication methodologies (Network Sovereignty, 27). With infrastructure built out with the white, affluent American Dream in mind, rather than the myriad minority communities that comprise a large portion of the consumer base, it is small wonder that minorities are likewise omitted from our institutions. Without the contributions of African American, Indigenous, and immigrant communities, racist communications became racist institutions, and the cycle of oppression continues. With each innovative step toward a technologically advanced society, someone is left out, overtly or covertly stepped on, or outright targeted, as is the case with facial recognition and other surveillance technologies.
In Washington, DC, the consequences of unfettered surveillance have been covertly unfurling for decades. Where tribes struggle to be recognized, African American communities drown in over-recognition: surrounded by instruments of surveillance, yet still misidentified by the faulty technology. Time and time again, studies have documented the high rates of misidentification rampant among facial recognition technologies; yet their use, rather than being curtailed, is amplified. In the aftermath of 9/11, DC police established a Joint Operations Command Center to track terrorist movements; a wall of screens displayed hundreds upon hundreds of the city’s live feeds. Predictably, within a matter of years the department had redirected the operation toward tracking and monitoring the city’s protesters. In 2020, a selection of the JOCC’s emails was leaked, finally unveiling its existence to the citizens it had been surveilling for twenty years. In that time the system had ‘progressed’: hundreds more of the city’s live feeds were available, and artificial intelligence had been implemented to correlate feeds with people of interest, supposed gang territories, suspected gang members, and predictive crime maps. Unsurprisingly, the JOCC’s datasets were vastly inaccurate, such that even if facial recognition software were less prone to mistakes, the system would still criminalize and track innocent civilians (Chris Geraldi, The Intercept, Jun. 18, 2022). Duarte’s historic continuity, Noble’s algorithmic oppression, and Benjamin’s continuous historic surveillance had combined to reveal a beast that, once unleashed, was impossible to rein back in.
The culture of surveillance has been rampant in our nation since its inception; distrust has characterized our society to a point of no return. From the earliest colonizers, through Jim Crow, to the present day, Americans have sought to surveil the nefarious ‘other’; this has crystallized into a distrust seeded so deeply within our institutions that we continuously stratify even as we innovate toward inclusion. With every technological innovation, we deepen the divide between Silicon Valley and the world, between urban and rural communities, between skin colors. The ‘haves’ and the ‘have-nots’ grow increasingly distanced, such that the ‘have-nots’ are all but forgotten, no longer innovated for, written off as objectionable outliers.
