Benevolent Fixes (3.4)
- Bert Adams
- Apr 19, 2023
- 4 min read
In our modern framework of technological society, we are confronted with a staggering number of "fixes" for the very stratification that technology amplifies. At this point, we may be intellectually aware that algorithms reproduce the biases of their creators, and perhaps even that no technology is truly neutral. But with that knowledge in the foreground, we would be utterly remiss, foolish even, to insist that the self-same systems which bred the codified inequity we currently combat will also, magically, be the ones to spirit us away from their pitfalls.
AI is increasingly employed by hirers, lenders, and captors alike (along with a seemingly limitless slew of other sectors), and it confounds me that the nefarious link between the three is not more commonly discussed. Systemic unemployment breeds systemic poverty, which breeds systemic incarceration, ad infinitum. "If it ain't broke, don't fix it," as the old adage goes; and for hirers, lenders, and captors, the system certainly "ain't broke." This brings us to the monetization factor: For Profit.
Hirers and lenders hold an evident relationship with monetization: hiring an individual necessitates paying them, and lending to an individual necessitates the expectation of repayment. But when the societal system infringes upon an individual's ability to gain employment, it sets off a cascading avalanche of limiting factors. How is one to get a loan without proof of the ability to repay? How is one to be hired without the loan necessary for proper lodging, including a mailing address and hygienic upkeep? How is one to eat without employment? This cyclical process is well documented, and it squarely lands many of its participants in a position of desperation that often leads to crime. But when the captors profit from incarcerated bodies, the cycle becomes something tantamount to psychological torture.
AI utilized for hiring screening systematically draws on the codified biases of past generations. Ruha Benjamin details how current AI hiring systems rack up immense inferred datasets from a few simple data points: first name, last name, and zip code. Even with a generically "white" name, someone living in a predominantly black neighborhood may be coded as "black," and reap the withering rewards of such an assumption (Benjamin 2019:99). But this would be only a small glance, a previewing peek, at the daily hardships faced by an ethnically diverse American populace. Merely living in a not-yet-gentrified neighborhood is enough to earn marks against oneself; if you are white, just imagine the cyclone of problematic limits that would now be arrayed against you if you simply possessed a fractional amount more melanin. Denied lodging, denied employment, and then, when you commit a minor infraction--marijuana possession, for instance, or jaywalking--not only are you arrested, but someone profits from your incarceration. The very money you were denied, lining the pockets of some faceless white man, hell-bent on propagating slave patrols under the guise of "blue lives matter". As if policemen were born in uniform, discriminated against for their policeman heritage, gatekept from employment due to their law-enforcement background, and couldn't simply take the uniform off to "pass" as plain-clothes civilians. Imagine how many incarcerated people of color must look at their captors' uniforms with disdain; at their air of persecution for a small taste of what it is like to be affiliated, in whole, with a color. "It must be hard," they'd say, "to be judged for the color of your clothes."
To make matters worse, with the rising overcrowding of prisons (perhaps if we decriminalized racially coded petty misdemeanors such as loitering, jaywalking, and minor possession, prisons would have more room?), the For Profit captor system has instituted a new method of "mass monitorization rather than mass incarceration" (Benjamin 2019:95) via Electronic Monitoring (EM) devices. According to Benjamin,
"Most EM is being used in pre-trial release programs for those who cannot afford bail, employing GPS to track individuals at all times – a newfangled form of incarceration before conviction...The very solutions to mass incarceration and prison overcrowding, in other words, give rise to innovative forms of injustice. They are, in short, racial fixes that harm even as they purport to help" (Benjamin 2019:96).
Surveillance is nothing new to African Americans, but the monetization of monitorization certainly lends a familiar bitterness to the whole affair. There is no way to disentangle the current practices of racialized incarceration from the antebellum slave patrols that birthed our very concept of policing. Taking it back even further, there is no way to disentangle these current practices, or those older ones, from the bounty the US government historically placed on the scalps of Indigenous people. In fact, that word sheds a bit of light on these incarceration practices: Bounty. When incarceration leads to profit, when quotas must be met, how are the captors' mindsets not to slip into the framework of some sort of deadly game? The highly contested Stanford Prison Experiment vividly illustrated the sort of mentality shifts evident in the minds of those in positions of power. Problematic as the experiment may have been, its lessons are hard to ignore. Give a man a baton and a bit of metaphorical height over another man, and evidently both will break. This is the state of our incarceration system; the main difference is merely the officers' ability to turn their body cams off.
Benjamin, Ruha. Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press, 2019. ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/wsu/detail.action?docID=5820427.