Robotic Racism (3.2)

  • Writer: Bert Adams
  • Apr 3, 2023
  • 3 min read
[The term "stereotype"] first referred to a practice in the printing trade whereby a solid plate called a “stereo” (from the ancient Greek adjective stereos, “firm,” “solid”) was used to make copies. The duplicate was called a “stereotype.” The term evolved; in 1850 it designated an “image perpetuated without change” and in 1922 was taken up in its contemporary iteration, to refer to shorthand attributes and beliefs about different groups. - Ruha Benjamin, Race After Technology

In an age when we, as a population, are moving rapidly towards automation, the question is often raised of how to make machine learning somehow devoid of human error. In theory, it sounds plausible. Machines, with their perfect logic and enough coding, should be capable of algorithmic decision-making without human tampering; but the issue lies in that 'enough coding' clause. At the starting end of any algorithm, robot, technology, or system, there will have been a human, or a team thereof, subconsciously codifying every pixel of their own biases. What's more dangerous still is that these human biases are now viewed as if they were opinion-null facts, merely data spit out by an automation; as if they were colorblind. Racially coded stereotypes hidden under the guise of statistics have made their way past our moral firewall, and they have systemically corrupted the inner workings of our society and infrastructures.


Even beginning our discussion with something as innocent as the names we are given, and give others in return, we immediately find a harsh ceiling. The data points linked to a name follow an individual entirely independently of who they actually are: ethnicity, age range, immigration status, gender, and criminality are just a few of the top contenders for the limiting factors one's name might assign them, regardless of their merit or applicability, and before our face is ever seen. Imagine having already failed the test just by filling out one's name correctly.


And yet, the level of care we seem to get in return from large corporations such as Facebook is the slap in the face that was their original motto: "Move Fast and Break Things", or, "if you never break anything, you're probably not moving fast enough." I can see where they were coming from: the innovative young bucks in Silicon Valley, here to smash expectations! But in moving so very fast--and in knowingly breaking things in the process--they granted themselves a sort of makeshift absolution from the harm their innovative processes may have caused. They certainly broke things--just not any glass ceilings.


Benjamin details in Race After Technology that a robot is "any machine that can perform a task, simple or complex, directed by humans or programmed to operate automatically," and that robots have long been a way not only to speak of dehumanization, but to not speak of racialization. The term itself stems from the Czech "robota", meaning "compulsory service", which in turn stems from a Slavic root meaning "servitude, hardship". As much as media portrayals such as The Jetsons would have us thinking of robots as our one-way ticket to the future, their very thematic origins are steeped in racism. Some of the earliest adverts predicting the rise of robotics coined them as "slaves" that "we" would have "again", making it clear from the start who the market clientele should be.


What was unclear, however, was that this meant these "robots"--technologies, systems, surveillance, infrastructure, automation, credit, citizen scoring, ad infinitum--would not only slave for some...but against others. Those left out of the original adverts were unlikely to receive any truly equitable service from these robots, instead having their stereotypes codified against them, weaponized, publicized, institutionalized, cemented, and reinforced. It seems that the slavery system, the racial caste system, is alive and well; simply digitized.


Benjamin, Ruha. Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press, 2019.
