Wednesday, October 2, 2019

Core Post 4: The Ease and Unease of Algorithms

It was interesting to read Noble’s points about whiteness and maleness as the dominant hegemony configuring algorithms, but my recent experience with digital technologies was actually quite different. Instead, I found myself bothered by the utter lack of generality and the hyper-customization of information services, which reduced my identity to a “shadow body,” as Gillespie describes (Gillespie 8).

Just yesterday, the undergraduate students in my section for Introduction to Television expressed all kinds of shock and horror at the fact that Netflix ‘targets’ its posters for content according to the racial identities of the viewers. See the relevant Guardian article here: https://www.theguardian.com/media/2018/oct/20/netflix-film-black-viewers-personalised-marketing-target.

One of the students, who identifies as a gay black male, relayed his experience of clicking on a show because the poster centrally featured someone who looked like him, only to find that, even after three seasons, the character was barely developed. Other students expressed similar distress and unease at this niche, stereotyped marketing tactic driven by customized algorithms and user data. They were outraged to be reduced to a category, to be rendered as “shadow bodies” in place of the whole of their identities. I asked students to send in screenshots of their Netflix pages so that we could explore this phenomenon further.

Reading this week’s articles made me think about the ease and unease associated with algorithms, and where the border between them lies. In a neoliberal world, when we use certain devices and services, we inevitably become laborers: as Gillespie notes, “it is seductive… for information providers to both track and commodify that activity in a variety of ways” (Gillespie 7). Our labor allows information systems like Netflix to generate “user-caricatures” that categorize and typify us. But our labor is voluntary: even if we are uneasy about the system, the ease it provides usually outweighs our unease.

My students will probably continue to be bothered by the racialized posters, but they will also continue using Netflix. I continue to use my Echo Dot to turn on my lights, even though I am aware, and wary, that it is always listening (Does it speak Korean?!). There certainly exists a paradox here… We know, and we fear, but we still labor for these systems. Are the systems inescapable? At what point does our unease become sufficient to resist the ease of the algorithms?

And I am thinking about all of this and writing this post in a Google Chrome Incognito window…

2 comments:

  1. Your post makes me think about Gillespie's claim that algorithms function as a technology of the self because they allow us to visualize ourselves. It's interesting how the shadow bodies and stereotypes that try to approximate our identities might also play a part in such a constitution of the self once we become aware of them. This type of relationship seems to be an important part of Noble's project at the level of community, where racist search results work to reinforce racist presuppositions. This seems more trivial, but I wonder if we'll start thinking of the self as necessarily implicated with an uncanny shadow body.

    1. Adam, that's a great point. While we were busy being horrified by the algorithmic construction of our identities, we didn't think about how it might feed back into the process of self-construction. We labor for these services by providing data, from which algorithms generate "shadow bodies." But do these "shadow bodies," in turn, influence our identities? My guess would be yes, and then the cycle repeats... So maybe it's not that "shadow bodies" replace the whole of our identities, but that they actually figure into them in a complicated way. But I think that might be even more off-putting than our initial concerns!
