Center for Ethics, Society, and Computing Events

The Center for Ethics, Society, and Computing (ESC – pronounced “escape”) is dedicated to intervening when digital media and computing technologies reproduce inequality, exclusion, corruption, deception, racism, or sexism. ESC is a research center and a collective of scholars committed to feminist, justice-focused, inclusive, and interdisciplinary approaches to computing.

Organizer: Christian Sandvig


Upcoming Events

ESC PLAN: The ESC Opening Event

A half-day public event to inaugurate the Center for Ethics, Society, and Computing.
More information is available on the event details page.


Friday, January 24, 2020 from 1:00 pm to 7:30 pm Eastern Standard Time (UTC/GMT-5).

Space 2435, North Quadrangle, 105 South State Street, Ann Arbor, MI 48109 (Off-site reception to follow.)


Video from this event will be streamed live. An RSVP is requested for in-person participants.


1:00 p.m. – 1:15 p.m.
Toward Ethics, Society, and Computing (ESC)
(Opening Remarks.)

1:15 p.m. – 2:45 p.m.
PANEL: Accountable Technology — An Oxymoron?
Julia Angwin, The Markup
danah boyd, Data & Society
Marc DaCosta, Enigma
Jen Gennai, Google
Christian Sandvig, University of Michigan (moderator)

3:15 p.m. – 4:45 p.m.
PANEL: Culture After Tech Culture — Unimaginable?
André Brock, Georgia Tech
Silvia Lindtner, University of Michigan (moderator)
Holly Okonkwo, Purdue University
Monroe Price, University of Pennsylvania
Shobita Parthasarathy, University of Michigan

4:45 p.m. – 5:00 p.m.
The Future of Ethics, Society, and Computing (ESC)
(Closing Remarks.)

5:30 p.m. – 7:30 p.m.
Reception and Mixer
The Circ Bar, 2nd Floor Private Lounge
210 South First Street, Ann Arbor, MI



Past Events

Chris Calabrese: Show Your Face?

November 20 | 4:00 pm - 5:20 pm
Weill Hall, Annenberg Auditorium
Additional information

Facial recognition technology is sweeping into our public and private lives. The government is deploying it at the border and throughout law enforcement investigations. Technology companies are building it into their social networks. Employers are using it to monitor movements and productivity. As the technology becomes increasingly powerful, accurate, and versatile, it is raising more and more privacy and civil liberties concerns, especially for marginalized or vulnerable populations. Christopher Calabrese, Vice President for Policy at the Center for Democracy & Technology, will discuss the pros and cons of facial recognition technology, how it is changing many aspects of our lives, and how policymakers should address it.

Rayid Ghani: Data Science for the Next Ten Years

November 14 | 8:45 am - 10:00 am
Ehrlicher Room, 3100 North Quad
Additional information

Rayid Ghani was Chief Scientist of the 2012 Obama campaign. He is currently a Distinguished Career Professor in Machine Learning at Carnegie Mellon University.

Tina Eliassi-Rad: Just Machine Learning

November 14 | 3:30 pm - 4:30 pm
Ehrlicher Room, 3100 North Quad
Additional information

In this talk, I will discuss current tasks, experiences, and performance measures as they pertain to fairness in machine learning. The most popular task thus far has been risk assessment. Most human decision-makers seem to use risk estimates for efficiency purposes and not to make fairer decisions. The task of risk assessment seems to enable efficiency instead of fairness. I will present an alternative task definition whose goal is to provide more context to the human decision-maker. I will discuss our null model for fairness and demonstrate how to use deviations from this null model to measure favoritism and prejudice in data.

Megan Finn: We Are All Well

November 11 | 4:30 pm - 5:30 pm
Ehrlicher Room, 3100 North Quad
Additional information

When an earthquake happens in California today, residents may turn to Twitter for government bulletins and the latest news, check Facebook for updates from friends and family, look to the United States Geological Survey (USGS) for online maps that show the quake's epicenter, and hope to count on help from the Federal Emergency Management Agency (FEMA). This information order articulates a particular epistemic experience of earthquake...

ESC POD: Faculty Mixer

November 1 | 4:30 pm - 6:30 pm
A mixer for Michigan faculty interested in ESC, to be held at "Hathaway’s Hideaway," a 1901 ward meeting hall redecorated with bar and restaurant furnishings from establishments that are significant in the history of Ann Arbor.
Additional information

Aynne Kokas: From Grindr to Cybersovereignty

October 29 | 12:00 pm - 1:00 pm
110 Weiser Hall
Additional information

The Chinese government has become increasingly involved in global standards-making events, such as the annual Internet Governance Forum and China’s Wuzhen Internet Summit (also known as the World Internet Conference), leveraging China’s national standing in international standards-building to shape the future of global Internet governance. At the same time, Chinese regulators are also exporting standards not through national or international governance frameworks, but through the community standards of individual platforms. This talk examines how the Chinese government is expanding its regulatory control over global consumer platforms through the expansion of Chinese-owned consumer platforms.

ESC POD: Ph.D. Student Mixer

October 18 | 4:30 pm - 6:30 pm
Hathaway’s Hideaway | 310 S. Ashley Street
A mixer for Michigan Ph.D. students interested in ESC, to be held at "Hathaway’s Hideaway," a 1901 ward meeting hall redecorated with bar and restaurant furnishings from establishments that are significant in the history of Ann Arbor. Additional information



Digitally Divided: The Art of Algorithmic (In)Decision

Katherine Behar
March 20, 2019 | 3:00pm to 4:00pm
Ehrlicher Room, 3100 North Quad
Additional information

In “Digitally Divided,” Behar presents her artwork with a focus on how algorithms dismantle and rearrange us. Across culture, algorithms have been unleashed to allocate complex systems into manageable portions. They mete out standardization and suppress idiosyncrasy across diverse and defiant populations of human and nonhuman objects, in ways that are socially, technically, and conceptually reductive. This lecture brings together examples of Behar’s videos, interactive installations, sculptures, and performances, alongside episodes from media history and popular culture to explore this core notion of being “digitally divided.”

Less Metrics, More Rando: (Net) Art as Software Research

Ben Grosser
March 27, 2019 | noon to 1:00pm
Ehrlicher Room, 3100 North Quad
Additional information

How are numbers on Facebook changing what we "like" and who we "friend"? Why does a bit of nonsense sent via email scare both your mom and the NSA? What makes someone mad when they learn Google can't see where they stand? From net art to robotics to supercuts to e-lit, Ben Grosser will discuss several artworks that illustrate his methods for investigating the culture of software.

Old, Raw, or New: A (New?) Deal for the Digital Age

Joy Lisi Rankin
April 11, 2019 | noon to 1:00pm
Ehrlicher Room, 3100 North Quad
Additional information

American historians debate whether Franklin Delano Roosevelt’s Depression-era legislation was, in fact, a New Deal, or perhaps an “Old Deal” or a “Raw Deal.” Considering multiple perspectives and voices, combined with the long sweep of history, stokes this lively, ongoing debate. In this CRITICALxDESIGN talk, I’ll turn my attention to American computing in the 1960s and 1970s to consider whether the academic networks of that era may be inspiration for a Digital New Deal. The users of 1960s and 1970s academic computing networks built, accessed, and participated in cooperative digital commons, developing now-quotidian practices of personal computing and social media. In the process, they became what I call “computing citizens.” I’ll use several case studies to illustrate the dynamic - and unexpected - relationships among gender, community, computing, and citizenship, including the Old Deals and the Raw Deals of computing citizenship. How might these computing citizens inform crucial contemporary debates about technology and justice?

Apparatuses of recognition: Google, Project Maven, and targeted killing

Lucy Suchman
April 19, 2019 | noon to 1:00pm
Ehrlicher Room, 3100 North Quad
Additional information

In June of 2018, following a campaign initiated by activist employees within the company, Google announced its intention not to renew a US Defense Department contract for Project Maven, an initiative to automate the identification of military targets based on drone video footage. Defenders of the program argued that it would increase the efficiency and effectiveness of US drone operations, not least by enabling more accurate recognition of those who are the program’s legitimate targets and, by implication, sparing the lives of noncombatants. But this promise raises a more fundamental question: What relations of reciprocal familiarity does recognition presuppose? And in the absence of those relations, what schemas of categorization inform our readings of the Other?

The focus of a growing body of scholarship, this question haunts not only US military operations but an expanding array of technologies of social sorting. Understood as apparatuses of recognition (Barad 2007: 171), Project Maven and the US program of targeted killing are implicated in perpetuating the very architectures of enmity that they take as their necessitating conditions. I close with some thoughts on how we might interrupt the workings of these apparatuses, in the service of wider movements for social justice.


Ethics & Politics of AI talks

Engaging Discourse and Justice in a Datafied World

Anna Lauren Hoffman
April 22, 2019 | 3:00pm to 4:00pm
Ehrlicher Room, 3100 North Quad
Additional information

Values of fairness, antidiscrimination, and inclusion occupy a central place in the emerging ethics of data and algorithms. Their importance is underscored by the reality that data-intensive, algorithmically-mediated decision systems—as represented by artificial intelligence and machine learning (AI/ML)—can exacerbate existing (or generate new) injustices, worsening already problematic distributions of rights, opportunities, and wealth. At the same time, critics of certain “fair” or “inclusive” approaches to the design and implementation of these systems have illustrated their limits, pointing to problems with reductive or overly technical definitions of fairness or a general inability to appropriately address representative or dignitary harms.

In this talk, I extend these critiques by focusing on problems of cultural and discursive violence. I begin by discussing trends in AI/ML fairness and inclusion discussion that mirror problematic tendencies from legal antidiscrimination discourses. From there, I introduce “data violence” as a response to these trends. In particular, I lay out the discursive bases of data-based violence—that is, the discursive forms by which competing voices and various “fair” or “inclusive” solutions become legible (and others marginalized or ignored). In doing so, I undermine any neat or easy distinction between the presence of violence and its absence—rather, our sense of fair or inclusive conditions contains and feeds the possibility of violent ones. I conclude by echoing feminist political philosopher Serene Khader’s call to move away from justice-imposing solutions toward justice-enhancing ones. Importantly, justice-enhancing efforts cannot simply be a matter of protecting or “including” vulnerable others, but must also attend to discourses and norms that generate asymmetrical vulnerabilities to violence in the first place.

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media

Tarleton Gillespie
April 25, 2019 | noon to 1:00pm
Ehrlicher Room, 3100 North Quad
Additional information

Content moderation can serve as a prism for examining what platforms are, and how they subtly torque public life. Our understanding of platforms too blithely accepted the terms in which they were sold and celebrated - open, impartial, connective, progressive, transformative - skewing our study of social behavior that happens on them, stunting our examination of their societal impact.

Content moderation doesn’t fit this celebratory vision. As such, it has often been treated as peripheral to what they do—a custodial task, like sweeping up, occasional and invisible. What if moderation is in fact central to what platforms do? Moderation is an enormous part of the work of running a platform, in terms of people, time, and cost. The work of policing all this caustic content and abuse haunts platforms, and profoundly shapes how they work.

Today, social media platforms are being scrutinized in the press; specific controversies, each a tiny crisis of trust, have gelled into a more profound interrogation of their responsibilities to users and society. What are the implications of the emerging demand that platforms serve not as conduits or arbiters, but as custodians? This is uncharted territory for the platforms, a very different notion of how they should earn the trust of their users and stand accountable to civil society.