Reading Room

Want to read more about the cultural implications of algorithms — from YouTube search to Galileo? Wondering what algorithmic bias is and how people are trying to grapple with it? Here’s an assortment of links that may be of interest:

Academic and professional publications

David Danks and Alex London: Algorithmic Bias in Autonomous Systems
Sophie Bishop: Anxiety, panic and self-optimization: Inequalities and the YouTube algorithm
Nick Seaver: Algorithms as Culture: Some tactics for the ethnography of algorithmic systems
Megan Garcia: Racist in the Machine: The Disturbing Implications of Algorithmic Bias
Solon Barocas and Andrew D. Selbst: Big Data’s Disparate Impact (California Law Review)
Dan McQuillan: Data Science as Machinic Neoplatonism
Keith Kirkpatrick: Battling Algorithmic Bias
Kate Crawford and Trevor Paglen: Excavating AI
Safiya Umoja Noble: Algorithms of Oppression: How Search Engines Reinforce Racism
Stepan Komkov and Aleksandr Petiushko: AdvHat: Real-world adversarial attack on ArcFace Face ID system
MIT AI Ethics Education Curriculum (Middle school students consider algorithmic bias and propose YouTube redesigns.)
Amy Alexander: SVEN (Surveillance Video Entertainment Network): Looking Back and Forward
The Algorithm is the Message: What the Robot Saw

Non-academic publications

Anouk Vleugels, The Next Web: Want AI to be less biased? Cherish your female programmers
Paul Mozur, NYTimes: One Month, 500,000 Face Scans: How China Is Using A.I. to Profile a Minority
Zeynep Tufekci, NYTimes: YouTube, the Great Radicalizer
Kelly Weill, Daily Beast: YouTube Tweaks Algorithm to Fight 9/11 Truthers, Flat Earthers, Miracle Cures
Tom Simonite, Wired: How Coders Are Fighting Bias in Facial Recognition Software
Raymond Biesinger, Politico: Is your Software Racist?
Jackie Snow, MIT Technology Review: Bias already exists in search engine results, and it’s only going to get worse
James Vincent, The Verge: Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech
Tom Simonite, Wired: Machines Learn a Sexist View of Women
Sharon Begley, Stat News: Racial bias skews algorithms widely used to guide care from heart surgery to birth, study finds
Guillaume Chaslot
Joy Buolamwini

Here’s the famous (and controversial) ProPublica study of Northpointe’s recidivism prediction software, which is used to help decide who gets out on bail, probation, etc.