Both the Noble and Bucher readings this week aim at a more material understanding of the impact of the digital. Bucher offers a somewhat tangled engagement (not to discredit her approach, but it is definitely less straightforward than Noble’s chapter) with the social lives of algorithms/algorithms as social life, while Noble argues straightforwardly that algorithms have disastrous material consequences for (mostly) always-already marginalized communities.
While I do agree that there is a certain reciprocity between the social life of algorithms and algorithms as social life (as per Bucher), Noble’s piece clearly highlights the genealogy of such a system’s uneven distribution. As Noble argues, algorithmic systems (whether they rank, populate, or recommend) resemble older historical systems (some of which still operate alongside them), and this seems to complicate the reciprocities Bucher highlights. Generating uneven distributions, and even creating systems of oppression for certain marginalized populations, seems to be the inherent goal of algorithms, since they are tools (among others) of the competing capitalist market.
As someone with similar professional training in information studies (I studied with Safiya Noble at UCLA) and who has worked with a multiplicity of information systems (in archives, in libraries, and with search engines), I’ve learned and seen that these material consequences come in all forms. At the UCLA special collections, for example, many students expressed their discomfort with some of the language used in the archive’s finding aids. While every institution can usually come up with its own unique language, most use the Library of Congress’s classification system. This not only reproduces the violence of the state onto marginalized subjects but also affects the type of research being done: as a lesbian, would I enjoy encountering the violence of my preferences being classified as a mental illness in my own institution of knowledge? While the archives offer a slower and perhaps less far-reaching example than algorithms, the principle seems to remain the same: violent results yield underrepresentation.
Another question I had while doing the reading this week was: can a system predicated on language ever do anything other than reproduce the limitations of language itself?
Also, I’d be interested in talking about how this approach to the digital (a much more humanistic, borderline social-scientific one) differs from the articles we read two weeks ago. What happens to the political in both cases (because I do think that Galloway and Chun, for example, were quite provocatively political in their polemic approach)? Is the reader situated differently by these two approaches? What works and what doesn’t when one decides to approach the subject of the digital from such different angles?