Thursday, November 7, 2019

Core Post 2


This week's readings reminded me of a particular idealism found in and around fan studies, convergence culture, and participatory culture in the late 00s. Specifically, I am reminded of the belief that the operations of surveillance were once and for all being inverted, placed into the hands of the crowd, made disruptive of hierarchies of power, and deployed to monitor police forces and government actors. This belief, summarized in part by Henry Jenkins here: https://youtu.be/ibJaqXVaOaI, belies the reality of contemporary surveillance, relying on now outmoded, or at least incomplete, models of surveillance processes (particularly video feeds and the readily discernible images captured from them).

Instead, as Andrejevic and Gates remind us, the contemporary landscape of surveillance strongly corresponds to the desire to capture everything, with the understanding that big stores of data can be culled (supposedly) effectively through advances in computation and data analytics (185). They state, “[t]he very notion of a surveillance target takes on a somewhat different meaning when surveillance relies upon mining large-scale databases: the target becomes the hidden patterns in the data rather than particular individuals or events” (190).
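To make that shift concrete, here is a minimal, hypothetical sketch (Python, scikit-learn, fabricated data, all names and numbers invented) of what targeting the pattern rather than the individual looks like in practice: a detector is fit over everything that was captured, and only afterwards are particular records flagged as deviations from the learned pattern.

```python
# Hypothetical sketch: dragnet-style pattern mining over fabricated "transaction" records.
# No individual is targeted in advance; the model surfaces statistical outliers after the fact.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Fabricated bulk data: 10,000 records, each reduced to a few numeric features
# (say, amount, hour of day, distance from home). The specifics are invented.
records = rng.normal(loc=[50, 14, 5], scale=[20, 4, 3], size=(10_000, 3))

# Fit on the whole capture, then ask which records deviate from the learned pattern.
detector = IsolationForest(contamination=0.001, random_state=0)
labels = detector.fit_predict(records)  # -1 = anomalous, 1 = "normal"

flagged = np.where(labels == -1)[0]
print(f"{len(flagged)} records flagged out of {len(records)}")
```

The point of the sketch is only that the "target" emerges from the model's output: whatever the flagged records have in common was never specified in advance.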

The lack of readily intelligible information creates a set of problems that those who believed in the immediate inversion of surveillance techniques failed to anticipate. People do not readily have access to the capital, equipment, techniques, and human resources needed to invert this particular manifestation of the surveillance state.

Does this lack of resources, then, mark the project of surveilling the system as a mere pipe dream? There are certainly processes, practices, and critiques of these new systems available, though as of now they appear, at least to my understanding, limited. Hito Steyerl, in “A Sea of Data: Apophenia and Pattern (Mis-)Recognition” (https://www.e-flux.com/journal/72/60480/a-sea-of-data-apophenia-and-pattern-mis-recognition/), proposes, following Benjamin Bratton, that the processes of data use in contemporary, data-driven surveillance systems have rehabilitated or relegitimized apophenia. Can these practices of pattern recognition be redeployed in resistance? Is revealing their limitations enough?
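Steyerl's point about apophenia can be illustrated with a toy experiment (a hypothetical sketch of my own, not anything from her essay): mine enough variables drawn from pure noise and "significant" correlations appear anyway.

```python
# Toy illustration of apophenia in data mining: pure noise, yet "patterns" emerge.
import numpy as np

rng = np.random.default_rng(42)

# 200 unrelated variables observed for only 25 "subjects" -- every value is random noise.
n_subjects, n_variables = 25, 200
noise = rng.normal(size=(n_subjects, n_variables))

# Correlate every variable with every other variable.
corr = np.corrcoef(noise, rowvar=False)

# Count pairs whose correlation looks "strong" (|r| > 0.5) purely by chance.
upper = np.triu_indices(n_variables, k=1)
strong = np.abs(corr[upper]) > 0.5
print(f"{strong.sum()} of {strong.size} variable pairs correlate at |r| > 0.5 by chance alone")
```

With roughly 20,000 pairs to search, a good number will clear that bar even though nothing in the data is connected to anything else; the "finding" is an artifact of looking.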

Similarly, Trevor Paglen’s work consistently puts into practice both a critique of surveillance and specific avenues for combating it. In Autonomy Cube (2014), for example, Paglen created a kind of movable Tor-relay hotspot / sculpture which “can be used by others around the world to anonymize their internet use. When Autonomy Cube is installed, both the sculpture, host institution, and users become part of a privacy-oriented, volunteer run internet infrastructure” (http://www.paglen.com/?l=work&s=cube).
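For a sense of what the sculpture's underlying mechanism looks like from a user's side, here is a minimal, hypothetical Python sketch of routing a request through a running Tor client's SOCKS proxy; it assumes Tor is listening on its default local port 9050 and that the requests and PySocks packages are installed, and it illustrates Tor use in general, not Paglen's actual installation.

```python
# Hypothetical sketch: sending an HTTP request through a local Tor SOCKS proxy
# (requires a running Tor client on 127.0.0.1:9050 and `pip install requests[socks]`).
import requests

# "socks5h" resolves hostnames through the proxy too, so DNS lookups are not leaked locally.
tor_proxy = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

# check.torproject.org reports whether the request actually arrived via the Tor network.
response = requests.get("https://check.torproject.org/api/ip", proxies=tor_proxy, timeout=30)
print(response.json())  # e.g. {"IsTor": true, "IP": "<some exit relay address>"}
```

Running a relay, as the Autonomy Cube does, is the other side of the same infrastructure: the machine donates bandwidth so that other people's traffic has somewhere to be routed through.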

The Hong Kong protests, too, have suggested that rather rudimentary masks and umbrellas can help to thwart facial recognition systems, and that easy-to-use apps can monitor police activity (at least until they are pulled from the app store).
