This week’s readings, with their explicit positioning in STS or
Critical Information Studies, definitely hit home for me, and they address
many of the fundamental questions that guide my current study on the convergence
of music streaming services and mobile dating platforms. The fictions of
mechanical neutrality and “culture-less” science have come to occupy a central position
in my research, and unearthing the cultural, social, economic, and political forces embedded
in technological infrastructures that tout rationality and objectivity constitutes
much of my current work on Spotify algorithms and their capacity to regulate user
behavior and self-concept. But to focus specifically on the four readings for
this week, all of the authors, in one way or another, bring to our attention the
crystallization and proliferation of hegemonic discourses through the
performance of algorithmic objectivity. Considering Gillespie’s discussion
of the “recursive loop” (p. 17), in which he points out that the scope of an algorithm
is essentially limited to the archive to which it is applied, I wonder whether the
algorithmic biases that Noble lays bare are, to a certain extent, not purely
algorithmic at all, but rather reflections of the actually existing available
information, or of “cultural algorithms” (p. 25) that are themselves products of larger regimes
of structural oppression. To elaborate, I am wondering to what extent the reproduction
of racist and sexist discourses online is the product of existing problematic cultural
frameworks being mirrored in algorithmic architecture, and to what extent it is the
product of corporate negligence in reproducing them. To what extent is it inevitable
versus ‘designful’? Bleakly put, can algorithms ever exist outside the agendas of White
masculine hegemony, neoliberalism, and capitalism, all of which are themselves tightly
bound together? I’m thinking here of Gillespie’s example:
…
a search for the phrase, “she invented,” would return the query, “did you mean ‘he
invented’?” While unsettling in its gender politics, Google’s response was completely
“correct,” explained by the sorry fact that over the entire corpus of the web,
the word “invented” is preceded by “he” much more often than “she.” Google
recognized this – and mistakenly presumed it meant the search query “she
invented” was merely a typographical error. Google, here, proves much less sexist
than we are. (p. 25)
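To make concrete how Google’s response can be statistically “correct” and still biased, here is a minimal sketch of a corpus-frequency “did you mean” heuristic of the kind Gillespie describes; the correction rule, the threshold, and the bigram counts below are hypothetical illustrations, not Google’s actual system:

```python
# A minimal sketch of a frequency-based "did you mean" heuristic.
# All counts and the threshold are hypothetical illustrations,
# not Google's actual implementation.

from typing import Optional

# Hypothetical bigram counts standing in for a web-scale corpus.
BIGRAM_COUNTS = {
    ("he", "invented"): 920_000,   # made-up figure
    ("she", "invented"): 61_000,   # made-up figure
}

def suggest_correction(query: str) -> Optional[str]:
    """Suggest a 'correction' when a pronoun-swapped variant dominates the corpus."""
    first, rest = query.split(" ", 1)
    swaps = {"she": "he", "he": "she"}
    if first not in swaps:
        return None
    original = BIGRAM_COUNTS.get((first, rest), 0)
    swapped = BIGRAM_COUNTS.get((swaps[first], rest), 0)
    # The rarer phrasing is presumed to be a typo whenever the alternative
    # dominates the corpus -- this is where corpus bias becomes output bias.
    if swapped > 10 * original:
        return f"did you mean '{swaps[first]} {rest}'?"
    return None

print(suggest_correction("she invented"))  # -> did you mean 'he invented'?
```

Nothing in this sketch is malicious by design; the bias arrives entirely through the counts, which is precisely the ambiguity between the ‘inevitable’ and the ‘designful’ raised above.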
Considering that Noble points out that Google has the capacity to “tweak”
or “fix” problematic search returns when they are exposed (p. 82), perhaps what
is desirable, then, is not the mechanical neutrality that algorithmic applications
tout, but rather explicitly ‘partial,’ ethical algorithms that mobilize their
totalizing capacity to actively counter the oppressive social and cultural discourses
they currently perpetuate…