Beyond the Rhetoric of Algorithmic Solutionism
If
you ever hear that implementing algorithmic decision-making tools to
enable social services or other high-stakes government decision-making
will increase efficiency or reduce the cost to taxpayers, know that
you’re being lied to. When implemented ethically, these systems cost more. And they should.
Whether
we’re talking about judicial decision-making (e.g., “risk assessment
scoring”) or modeling who is at risk for homelessness, algorithmic
systems don’t simply cost money to implement.
They cost money to maintain. They cost money to audit. They cost money
to evolve with the domain that they’re designed to serve. They cost
money to train their users to use the data responsibly. Above all, they
make visible the brutal pain points and root causes in existing systems
that require an increase in services.
Otherwise,
all these systems do is divert taxpayer money from direct services
into the pockets of for-profit entities under the
illusion of helping people. Worse, they’re helping usher in a diversion
of liability because time and time again, those in powerful positions
blame the algorithms.
This
doesn’t mean that these tools can’t be used responsibly. They can. And
they should. The insights that large-scale data analysis can offer are
inspiring. The opportunity to help people by understanding the complex
interplay of contextual information is invigorating. Any social
scientist with a heart desperately wants to understand how to relieve
inequality and create a fairer, more equitable system. So of course
there’s a desire to jump in and try to make sense of the data out there
to make a difference in people’s lives. But to treat data analysis as a
savior to a broken system is woefully naive.
Doing
so obfuscates the financial incentives of those who are building these
services, the deterministic rhetoric that they use to justify their
implementation, the opacity that results from having non-technical
actors try to understand technical jiu-jitsu, and the stark reality of
how technology is used as a political bludgeoning tool. Even more
frustratingly, what data analysis does well is open up opportunities for
experimentation and deeper exploration. But in a zero-sum context, that
means that the resources to do something
about the information that is learned are siphoned off to the
technology. And, worse, because the technology is supposed to save
money, there is no budget for using that data to actually help people.
Instead, technology becomes a mirage. Not because the technology is
inherently bad, but because of how it is deployed and used.
Next week, a new book that shows the true cost of these systems is being published. Virginia Eubanks’ book “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor”
is a deeply researched accounting of how algorithmic tools are
integrated into services for welfare, homelessness, and child
protection. Eubanks goes deep with the people and families who are
targets of these systems, telling their stories and experiences in rich
detail. Further, drawing on interviews with social services clients and
service providers alongside the information provided by technology
vendors and government officials, Eubanks offers a clear portrait of just how algorithmic systems actually play out on the ground, despite all of the hope that goes into their implementation.
Eubanks
eschews the term “ethnography” because she argues that this book is
immersive journalism, not ethnography. Yet, from my perspective as a
scholar and a reader, this is the best ethnography I’ve read in years. “Automating Inequality”
does exactly what a good ethnography should do — it offers a compelling
account of the cultural logics surrounding a particular dynamic, and
invites the reader to truly grok what’s at stake through the eyes of a
diverse array of relevant people. Eubanks brings you into the world of
technologically mediated social services and helps you see what this
really looks like on the ground. She showcases the frustration and
anxiety that these implementations produce; the ways in which both
social services recipients and
taxpayers are screwed by the false promises of these technologies. She
makes visible the politics and the stakes, the costs and the hope. Above
all, she brings the reader into the stark and troubling reality of what
it really means to be poor in America today.
“Automating Inequality” is on par with Barbara Ehrenreich’s “Nickel and Dimed” or Matthew Desmond’s “Evicted.”
It’s rigorously researched, phenomenally accessible, and utterly
humbling. While there are a lot of important books that touch on the
costs and consequences of technology through case studies and
well-reasoned logic, this book is the first one that I’ve read that
really pulls you into the world of algorithmic decision-making and
inequality, like a good ethnography should.
I
don’t know how Eubanks chose her title, but one of the subtle things
about her choice is that she’s (unintentionally?) offering a fantastic
backronym for AI. Rather than thinking of AI as “artificial
intelligence,” Eubanks effectively builds the case that, in
practice, AI often means “automating inequality.”
This
book should be mandatory for anyone who works in social services,
government, or the technology sector because it forces you to really
think about what algorithmic decision-making tools are doing to our
public sector, and the costs this imposes on the people who are
supposedly being served. It’s also essential reading for taxpayers and
voters who need to understand why technology is not the panacea that
it’s often purported to be. Or rather, how capitalizing on the benefits
of technology will require serious investment and a deep commitment to
improving the quality of social services, rather than a tax cut.
Please please please read this book. It’s too important not to.
Data
& Society will also be hosting Virginia Eubanks to talk about her
book on January 17th at 4PM ET. She will be in conversation with Julia
Angwin and Alondra Nelson. The event is sold out, but it will be livestreamed online. Please feel free to join us there!