Friday, May 20, 2022

Review: "Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor" by Virginia Eubanks

Something about the digital world frightens people into believing they are facing a completely alien, Cthulhian beast, as opposed to simply an online version of the usual suspects. Shoshana Zuboff makes this mistake in The Age of Surveillance Capitalism, suggesting the 'behavioural surplus' extracted by Facebook and its ilk fuels an economic system fundamentally different from the wholesome, warm and fuzzy capitalism of Henry Ford and company. I wrote in my review of that book that she failed to substantiate her argument:

Is Google hiding how much data it collects from you really all that different from Apple hiding the conditions of its manufacturing facilities? Are Facebook's attempts to manipulate your emotions or your sense of self-worth really a whole new beast, or just another step in the advertising industry's development? Is the desire of surveillance capitalism companies to expand vertically and horizontally into new parts of our lives and into new parts of the world, to privatize or profit off public goods, any different from the same expansion drive of any other company?

Virginia Eubanks's Automating Inequality sees through the Silicon Valley smoke and mirrors, and instead correctly draws a direct line from the poorhouses of the 19th century, through the scientific charity and eugenics of the 20th century, to the automated and algorithmic social systems of today. She coins the term "digital poorhouse", likening the publicly funded facilities that offered wretched living conditions in exchange for grueling work to the systems of digital tracking and automated decision-making that govern the distribution of public resources today.

Like the brick-and-mortar poorhouse, the digital poorhouse diverts the poor from public resources. Like scientific charity, it investigates, classifies, and criminalizes. Like the tools birthed during the backlash against welfare rights, it uses integrated databases to target, track, and punish.

She tracks three systems in particular: IBM's "modernization" of the welfare administration system in Indiana, the social sorting algorithm implemented for sheltering the unhoused in Los Angeles, and a model implemented in Pittsburgh to predict child harm. The chapters detailing these examples are compelling, and combine stories from social workers and people affected by these systems with data and perspectives from academics. They're also infuriating and saddening to read.

The final chapter, in which she ties these stories together with the cultural forces that enable such systems to exist (e.g. the culture of individualism, middle-class anxiety, racism), is excellent. Eubanks grounds her critique in a historical understanding of how these systems came to be.

Just as the county poorhouse was suited to the Industrial Revolution, and scientific charity was uniquely appropriate for the Progressive Era, the digital poorhouse is adapted to the particular circumstances of our time. The county poorhouse responded to middle-class fears about growing industrial unemployment: it kept discarded workers out of sight but nearby, in case their labor was needed. Scientific charity responded to native elites' fear of immigrants, African Americans, and poor whites by creating a hierarchy of worth that controlled access to both resources and social inclusion. Today, the digital poorhouse responds to what Barbara Ehrenreich has described as a "fear of failing" in the professional middle class.

I think that because she is able to see the similarities between current technological solutions and the social systems of the past, she is better able to identify the unique aspects of modern automation and algorithms. She concludes that the digital poorhouse is hard to understand, massively scalable, persistent over time, and alienating in a distinctly new way:

Containment in the physical institution of a poorhouse had the unintentional result of creating class solidarity across race, gender, and national origin. When we sit at a common table, we might see similarities in our experiences, even if we are forced to eat gruel. Surveillance and digital social sorting drive us apart as smaller and smaller microgroups are targeted for different kinds of aggression and control. When we inhabit an invisible poorhouse, we become more and more isolated, cut off from those around us, even if they share our suffering.

Working in data science, I think often about the ethical obligations of the profession. Sometimes I wish books like this one (along with Cathy O'Neil's Weapons of Math Destruction and Caroline Criado Perez's Invisible Women) were required reading. I'm under no illusion that professional certification or licensing of data science would solve the issue. Eubanks isn't, I think, the first to suggest a Hippocratic Oath for data science, but perhaps that would help with a culture shift.

I'll end with the two questions she asks people developing technological solutions that address poverty, because I think they're great:

  1. Does the tool increase the self-determination and agency of the poor?
  2. Would the tool be tolerated if it were targeted at non-poor people?
