Longterm Wiki

Coded Bias - Wikipedia

reference

Credibility Rating

3/5
Good (3)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: Wikipedia

Wikipedia article about 'Coded Bias,' a 2020 documentary exploring algorithmic bias in facial recognition and AI systems, relevant to AI safety discussions around fairness, discrimination, and the societal harms of unregulated AI deployment.

Metadata

Importance: 45/100 · wiki page · reference

Summary

Coded Bias is a 2020 documentary directed by Shalini Kantayya, centered on MIT researcher Joy Buolamwini's discovery that facial recognition systems fail to recognize darker-skinned faces. The film examines how AI algorithms embed racial and gender biases affecting housing, healthcare, hiring, and policing. It advocates for legal frameworks to govern AI and highlights the work of the Algorithmic Justice League.

Key Points

  • Joy Buolamwini discovered facial recognition systems failed to detect her face unless she wore a white mask, revealing racial bias in AI.
  • The documentary argues AI algorithms discriminate by race and gender in domains like housing, healthcare, credit, education, and criminal justice.
  • It highlights the lack of legal structures governing AI, framing unchecked algorithmic decision-making as a human rights issue.
  • Contributors include prominent AI ethics researchers: Timnit Gebru, Cathy O'Neil, Virginia Eubanks, Zeynep Tufekci, and Safiya Noble.
  • Buolamwini testified before the US Congress and founded the Algorithmic Justice League as a result of her research.

1 FactBase fact citing this source

Cached Content Preview

HTTP 200 · Fetched Apr 7, 2026 · 12 KB
Coded Bias - Wikipedia
From Wikipedia, the free encyclopedia

2020 American documentary film
Coded Bias
Directed by: Shalini Kantayya
Produced by: Shalini Kantayya
Production company: 7th Empire Media
Release dates: January 2020 (Sundance); November 11, 2020
Running time: 90 minutes
Country: United States
Language: English

Coded Bias is an American documentary film directed by Shalini Kantayya that premiered at the 2020 Sundance Film Festival.[1] The film includes contributions from researchers Joy Buolamwini, Deborah Raji, Meredith Broussard, Cathy O'Neil, Zeynep Tufekci, Safiya Noble, Timnit Gebru, Virginia Eubanks, Silkie Carlo, and others.[2]

 
 Background

Kantayya previously directed a documentary titled Catching the Sun and also directed one episode of the National Geographic television series Breakthrough.[3][4] She is also an associate of the UC Berkeley Graduate School of Journalism.[5] Kantayya said in an interview with 500 Global on August 17, 2021, that three years earlier she had not even known what an algorithm was.[6] She then read the book Weapons of Math Destruction, which describes how artificial intelligence, machine learning, and algorithms can determine outcomes for certain people. She later came across the work of Joy Buolamwini through a TED Talk.

 Summary

The documentary is about artificial intelligence and the biases that can be embedded in the technology. MIT Media Lab researcher Joy Buolamwini's computer science studies uncovered that her face was unrecognizable to many facial recognition systems, and she worked to find out why these systems failed. She found that the facial recognition programs only detected her face when she wore a white mask. She goes on to investigate how else artificial intelligence can affect minorities.[7]

Coded Bias says that there is a lack of legal structures governing artificial intelligence, and that as a result, human rights are being violated. It says that some algorithms and artificial intelligence technologies discriminate by race and gender in domains such as housing, career opportunities, healthcare, credit, education, and criminal justice.[8] Buolamwini and her colleagues were later asked to testify before the US Congress about artificial intelligence. Buolamwini subsequently founded a digital advocacy group, the Algorithmic Justice League.[9]

The movie highlights how facial recognition systems can cause problems for vulnerable groups, because bias in the code means the systems do not recognize everyone equally. As companies use more machine learning, the algorithms discussed have substantial i

... (truncated, 12 KB total)
Resource ID: kb-f9c086438d74fc55