Transparent Bias (2.2)
- Bert Adams
- Feb 22, 2023
- 3 min read
"Even though women constitute just slightly over half of Internet users, women’s voices and perspectives are not as loud and do not have as much impact online as those of men. [This] demonstrates how some users of the Internet have more agency and can dominate the web, despite the utopian and optimistic view of the web as a socially equalizing and democratic force" (Noble 2018:58).
Society as a whole tends to think of the internet, or really most technologies, as inherently neutral. A machine surely cannot have an opinion or perpetuate bias--right? While that might be true in some alternate reality where machines came before man, in this reality humans create, program, teach, and disseminate technologies. And humans, as opposed to theoretical perfect machine systems, are inherently flawed. Thus, we necessarily program our flaws, our biases, our limited understanding directly into the very systems we claim will enrich our lives. But whose lives are they really enriching, if they are merely a reflection of the technical elite?
As technologies advance into near ubiquity, separate technologies branch from their foundations, interact with one another, and ultimately heighten the sensation that if one technology is neutral, its successors must be as well. All in all, this serves as a life-size mirror: by not catching the bias in the original iterations, engineers programmed every nuance of those biases into every variation of the technology. The public naively reaps the 'reward': systemically ingrained prejudice masquerading as neutrality.
What's worse is that once the public became aware of this implicit--and sometimes explicit--bias, the narrative turned on us. 'Problematic search recommendations? Don't blame the corporation, blame your neighbor!' This reeks of the same narrative that ignores corporate participation in climate change while placing the bulk of the blame on small individual choices by the general public--i.e. 'If only you stopped buying plastic straws, the climate would fix itself.' Alphabet Corp. denied any wrongdoing, shifting the blame for racist, sexist, bigoted material and recommendations onto a supposedly neutral landscape of 'transparency'.
And yet, the days of blanket immunity under Section 230 seem to be at an end. Yesterday, on February 21st, 2023, the Supreme Court heard a case brought against Alphabet Corp.--specifically its YouTube branch. The family of Nohemi Gonzalez, who was killed in the 2015 Paris terrorist attacks, alleges that YouTube "aided and abetted in the attack by recommending ISIS videos to people who might be interested in them. The argument is that by recommending these videos Google promoted ISIS recruiting, propaganda and terrorist attacks," in violation of America's federal Anti-Terrorism Act (Totenberg). If the case against Alphabet Corp. succeeds, Google will not be the only platform forced to restructure itself. Meta, Twitter, and the myriad other companies that make up a sizable share of the US economy sided with Google in the case, arguing that the backbone--the core--of what we consider internet usage today relies upon recommendations, trending features, up-next queues, and other algorithmically- or artificial-intelligence-driven features. Some justices further worried that if the case were to prevail, the litigation would be non-stop, as platforms could suddenly be held liable for all of the content they host.
Something tells me that the Gonzalezes couldn't care less. What is corporate stability in the face of loss of life?
Read Totenberg's NPR article for more information on the Supreme Court case: https://www.npr.org/2023/02/21/1158628409/supreme-court-section-230-arguments
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018. ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/wsu/detail.action?docID=4834260.
Totenberg, Nina. "No Ideological Splits, Only Worried Justices as High Court Hears Google Case." NPR, 21 Feb. 2023. https://www.npr.org/2023/02/21/1158628409/supreme-court-section-230-arguments
