Center for Ethics, Society, and Computing Events


The Center for Ethics, Society, and Computing (ESC – pronounced “escape”) is dedicated to intervening when digital media and computing technologies reproduce inequality, exclusion, corruption, deception, racism, or sexism. ESC is a research center and a collective of scholars committed to feminist, justice-focused, inclusive, and interdisciplinary approaches to computing.

Organizer: Christian Sandvig


CRITICAL x DESIGN

Digitally Divided: The Art of Algorithmic (In)Decision

Katherine Behar
March 20, 2019 | 3:00pm to 4:00pm
Ehrlicher Room, 3100 North Quad

Abstract  
In “Digitally Divided,” Behar presents her artwork with a focus on how algorithms dismantle and rearrange us. Across culture, algorithms have been unleashed to divide complex systems into manageable portions. They mete out standardization and suppress idiosyncrasy across diverse and defiant populations of human and nonhuman objects, in ways that are socially, technically, and conceptually reductive. This lecture brings together examples of Behar’s videos, interactive installations, sculptures, and performances, alongside episodes from media history and popular culture, to explore this core notion of being “digitally divided.”

Less Metrics, More Rando: (Net) Art as Software Research

Ben Grosser
March 27, 2019 | noon to 1:00pm
Ehrlicher Room, 3100 North Quad

Abstract  
How are numbers on Facebook changing what we “like” and who we “friend”? Why does a bit of nonsense sent via email scare both your mom and the NSA? What makes someone mad when they learn Google can’t see where they stand? From net art to robotics to supercuts to e-lit, Ben Grosser will discuss several artworks that illustrate his methods for investigating the culture of software.

Old, Raw, or New: A (New?) Deal for the Digital Age

Joy Lisi Rankin
April 11, 2019 | noon to 1:00pm
Ehrlicher Room, 3100 North Quad

Abstract  
American historians debate whether Franklin Delano Roosevelt’s Depression-era legislation was, in fact, a New Deal, or perhaps an “Old Deal” or a “Raw Deal.” Considering multiple perspectives and voices, together with the long sweep of history, stokes this lively, ongoing debate. In this CRITICALxDESIGN talk, I’ll turn my attention to American computing in the 1960s and 1970s to consider whether the academic networks of that era may serve as inspiration for a Digital New Deal. The users of 1960s and 1970s academic computing networks built, accessed, and participated in cooperative digital commons, developing now-quotidian practices of personal computing and social media. In the process, they became what I call “computing citizens.” I’ll use several case studies to illustrate the dynamic, and often unexpected, relationships among gender, community, computing, and citizenship, including the Old Deals and the Raw Deals of computing citizenship. How might these computing citizens inform crucial contemporary debates about technology and justice?

Apparatuses of Recognition: Google, Project Maven, and Targeted Killing

Lucy Suchman
April 19, 2019 | noon to 1:00pm
Ehrlicher Room, 3100 North Quad

Abstract  
In June of 2018, following a campaign initiated by activist employees within the company, Google announced its intention not to renew a US Defense Department contract for Project Maven, an initiative to automate the identification of military targets based on drone video footage. Defenders of the program argued that it would increase the efficiency and effectiveness of US drone operations, not least by enabling more accurate recognition of those who are the program’s legitimate targets and, by implication, sparing the lives of noncombatants. But this promise raises a more fundamental question: What relations of reciprocal familiarity does recognition presuppose? And in the absence of those relations, what schemas of categorization inform our readings of the Other?

The focus of a growing body of scholarship, this question haunts not only US military operations but an expanding array of technologies of social sorting. Understood as apparatuses of recognition (Barad 2007: 171), Project Maven and the US program of targeted killing are implicated in perpetuating the very architectures of enmity that they take as their necessitating conditions. I close with some thoughts on how we might interrupt the workings of these apparatuses, in the service of wider movements for social justice.


Ethics & Politics of AI Talks

Engaging Discourse and Justice in a Datafied World

Anna Lauren Hoffman
April 22, 2019 | 3:00pm to 4:00pm
Ehrlicher Room, 3100 North Quad

Abstract  
Values of fairness, antidiscrimination, and inclusion occupy a central place in the emerging ethics of data and algorithms. Their importance is underscored by the reality that data-intensive, algorithmically mediated decision systems—as represented by artificial intelligence and machine learning (AI/ML)—can exacerbate existing (or generate new) injustices, worsening already problematic distributions of rights, opportunities, and wealth. At the same time, critics of certain “fair” or “inclusive” approaches to the design and implementation of these systems have illustrated their limits, pointing to problems with reductive or overly technical definitions of fairness or a general inability to appropriately address representational or dignitary harms.

In this talk, I extend these critiques by focusing on problems of cultural and discursive violence. I begin by discussing trends in AI/ML fairness and inclusion discussions that mirror problematic tendencies from legal antidiscrimination discourses. From there, I introduce “data violence” as a response to these trends. In particular, I lay out the discursive bases of data violence—that is, the discursive forms by which competing voices and various “fair” or “inclusive” solutions become legible (while others are marginalized or ignored). In doing so, I undermine any neat or easy distinction between the presence of violence and its absence—rather, our sense of fair or inclusive conditions contains and feeds the possibility of violent ones. I conclude by echoing feminist political philosopher Serene Khader’s call to move away from justice-imposing solutions toward justice-enhancing ones. Importantly, justice-enhancing efforts cannot simply be a matter of protecting or “including” vulnerable others; they must also attend to the discourses and norms that generate asymmetrical vulnerabilities to violence in the first place.

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media

Tarleton Gillespie
April 25, 2019 | noon to 1:00pm
Ehrlicher Room, 3100 North Quad

Abstract  
Content moderation can serve as a prism for examining what platforms are and how they subtly torque public life. Our understanding of platforms has too blithely accepted the terms in which they were sold and celebrated (open, impartial, connective, progressive, transformative), skewing our study of the social behavior that happens on them and stunting our examination of their societal impact.

Content moderation doesn’t fit this celebratory vision. As such, it has often been treated as peripheral to what platforms do: a custodial task, like sweeping up, occasional and invisible. What if moderation is in fact central to what platforms do? Moderation is an enormous part of the work of running a platform, in terms of people, time, and cost. The work of policing all this caustic content and abuse haunts platforms and profoundly shapes how they work.

Today, social media platforms are being scrutinized in the press; specific controversies, each a tiny crisis of trust, have gelled into a more profound interrogation of their responsibilities to users and society. What are the implications of the emerging demand that platforms serve not as conduits or arbiters, but as custodians? This is uncharted territory for the platforms, a very different notion of how they should earn the trust of their users and stand accountable to civil society.