
Surveilled Glitches (3.3)

  • Writer: Bert Adams
  • Apr 13, 2023
  • 3 min read

It is a common error to consider glitches as aberrations, momentary computational lapses, humorous snafus; yet this mindset overlooks the influence of designers, engineers, surveyors, testers, and management. For a glitch to make it into the final iteration of a project, it has to have been missed by any number of people. One might be tempted to downplay the potential severity of such an infraction, but these glitches are often emblematic of larger issues at play. Consider a white team of devs who all miss a blatantly racist output: Was this coincidental oversight, or did an overarching theme, belief, or set of values impair the devs' ability to flag the output? Evidence leans toward the latter.


"As Andrew Russell and Lee Vinsel contend, '[t]o take the place of progress, ‘innovation,’ a smaller, and morally neutral, concept arose. Innovation provided a way to celebrate the accomplishments of a high-tech age without expecting too much from them in the way of moral and social improvement'" (Benjamin 2019:54).

This ideal has run rampant throughout the coffers and high-rises of Silicon Valley: Innovation in the place of Progress. Newer, bigger, faster, and above all flashier have taken the place of inherently 'better'. New tech has generally been pushed out to the benefit of the perceived majority (read: superior), with little consideration given to minorities. Ruha Benjamin illustrates several examples of this, not the least of which is photographic film favoring pale skin until furniture makers and chocolatiers complained. It turns out, the capitalistic drive to sell wares far outranks the plight of dark-skinned schoolchildren, systematically erased from their institutions by film.


And yet, once the complaints of the affluent white workers were addressed, people of color somehow still came out on the wrong side of the coin. Now able to be photographed alongside their lighter-skinned classmates, they found themselves at the focal point of surveillance. Benjamin details how, in South Africa, the Polaroid company turned a profit marketing a new flash 'boost', enabling darker-skinned subjects to be captured photographically, and perhaps even physically. The new technology was weaponized by the apartheid regime, with the complicity of Polaroid executives, to create 'passbooks' for black South Africans, which limited their movement to and from urban areas; those found without one could be penalized, fined, or even imprisoned.


In the years since, every time a technological stagnation has been overcome in the field of capturing black likenesses, the resulting advance has been weaponized. The affluent white community, which makes up a majority of the technological sphere, has both the propensity and the privilege of regarding such innovations as progress, overlooking, perhaps willfully, the target practice they are helping to facilitate. Black skin is finally seen by facial tracking software? Boom. Surveillance. New software is developed that prioritizes the facial features and structure of people of color? Boom. Surveillance. At every turn, innovations disguised as progress are weaponized, their pins pulled, as they are hurled into the unsuspecting crowd eagerly awaiting its fair share.


Iran is the latest in a slew of national authoritarian surveillance crackdowns. As detailed in this BBC article, Iran's government has begun installing facial recognition cameras, explicitly to surveil, warn, and eventually punish women found to be violating the country's hijab law. Women captured by the cameras will first receive a text alert warning them of their infraction; later, they could receive notice of a hefty fine, or of their impending arrest. Last year, Mahsa Amini, a young Kurdish woman, was apprehended by authorities for violating the nation's dress code; when she died in police custody, thousands took to the streets in protest across the country. Just last week, a man dumped a bowl of yogurt onto the heads of two young, unveiled women, who were subsequently arrested themselves. We would do well to remember that those who are the most heavily surveilled are often those perceived as a threat by virtue of who and what they are, rather than what they have done. The slippery slope of preemptive policing leads necessarily to unjust bias and the codification of our ugliest traits: Our Fears.


Benjamin, Ruha. Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press, 2019. ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/wsu/detail.action?docID=5820427.

