Thursday, October 24, 2019

Core Post: Starbucks, Dirty Data, Algorithms, Dirty Curves, and More!


A few weeks ago, I visited a friend in Santa Barbara. After stopping at an ice cream shop, we took a nice stroll through the downtown area. However, our evening soon took an unsettling turn as we walked by a Starbucks. Suddenly, a Black man in his twenties or early thirties hastily exited the building, informing us that the employees had called the police on him. Recalling recent events such as the harassment and wrongful arrest of two Black men in a Philadelphia Starbucks and the vicious profiling of a Black Muslim man who received his coffee with “ISIS” printed on the cup, we entered the store.

The two White women workers who had called the cops claimed the guest was “erratic” and “aggressive”; however, the only “example” they cited was that he appeared to be asleep while looking down at his phone. When we immediately pointed out the recent incidents of harassment of Black guests in Starbucks stores, the workers apologized to my friend and me rather than to the man on whom they had wrongfully called the police. We then contacted corporate via phone, e-mail, and a live online chat service.

Having recently experienced several instances of profiling, I found that Rashida Richardson, Jason M. Schultz, and Kate Crawford’s “Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice” stood out in this week’s readings. While the Safiya Umoja Noble reading from our last class meeting dealt with how software engineers’ biases shape an algorithm’s representation of oppressed groups, the Richardson et al. paper featured a similarly timely discussion of how dirty data translates into predictive policing programs, thus disproportionately affecting “overly policed” and excessively criminalized groups.[1] Interestingly, amid these experiences and discussions of how software engineers and the reporters of “dirty data” negatively impact oppressed groups, the live online chat personnel were not as receptive or understanding of the situation when my friend communicated via that outlet.

Taking it home . . .

In their discussion of “dirty data,” Richardson et al. note that “…even calling this information ‘data’ could be considered a misnomer, since ‘data’ implies some type of consistent scientific measure or approach.”[2] Moreover, the authors report that the New Orleans Police Department has refused to confront certain types of complaints regarding “discriminatory policing.”[3] Interestingly, in my blog post of October 3, 2019, I mentioned how my own department refuses to act upon certain situations (specified in that post) in a meaningful way despite several student complaints over the years. Additionally, despite various racially insensitive teachings and incidents, TAs are expected to grade “according to the expectations of the instructor,”[4] and certain courses follow what is in theory (and even stated in manuals) a secret (perhaps “dirty”) distribution curve with no formula, a practice that resembles Richardson et al.’s observed phenomenon of dirty data serving as a misnomer.

While it’s great to have discussions on these issues, as I stressed in my Queer OS presentation, the real challenge lies in integrating them into practice.

--Kam


[1] Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York, NY: NYU Press, 2018); Rashida Richardson, Jason M. Schultz, and Kate Crawford, “Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice,” New York University Law Review, 2019, 197.
[2] Richardson, Schultz, and Crawford, “Dirty Data, Bad Predictions,” 199.
[3] Richardson, Schultz, and Crawford, 212.
[4] Teaching Assistant Guide, 2019-2020 (Los Angeles, CA: Division of Cinema and Media Studies, School of Cinematic Arts, University of Southern California, 2019), 23.
