Longterm Wiki

Algorithmic Justice League

Organization
Founded: 2016 (10 years old) · HQ: Cambridge, MA · Website: ajl.org

Key People

Joy Buolamwini
Founder & Executive Director
2016 – present
Founded AJL in 2016 as a graduate student at the MIT Media Lab and serves as Executive Director. Known for the Gender Shades research on facial recognition bias. Confirmed on the AJL about page as of 2026-04-03.

Sasha Costanza-Chock
Research Director
Research Director at AJL and an MIT professor. Led the CRASH Project beginning in 2019. Confirmed on the AJL about page and multiple sources as of 2026-04-03.

Funding History

Rockefeller Foundation Grant: Algorithmic Vulnerability Bounty (grant, Jul 2020)
$150K, led by the Rockefeller Foundation
Grant period July 1, 2020 – November 30, 2021, for the Algorithmic Vulnerability Bounty, a participatory platform for identifying and addressing AI algorithmic harms.
Source: rockefellerfoundation.org

All Facts

Organization

Founded Date: 2016
Headquarters: Cambridge, MA

General

Website: https://www.ajl.org

Other

Advocacy (2019): Dr. Joy Buolamwini testified before the House Committee on Science, Space, and Technology on AI and facial recognition bias.
Documentary (2021): Coded Bias (2020), directed by Shalini Kantayya, premiered at the Sundance Film Festival; 90 minutes. Released on Netflix April 5, 2021 and broadcast on PBS Independent Lens. Holds a 100% Rotten Tomatoes score (52 reviews, 7.9/10 average) and was Emmy-nominated for Outstanding Science/Technology Documentary.
Funding sources (2025): Funded by the Ford Foundation, MacArthur Foundation, Alfred P. Sloan Foundation, Rockefeller Foundation, Mozilla Foundation, and individual donors.
Leadership recognition (2025): Dr. Joy Buolamwini elected to the NAACP Legal Defense Fund Board.
Project (2025): Freedom Flyers Campaign investigating TSA facial recognition at 250+ US airports; published the 'Comply To Fly?' report.
Project (Jul 2020): CRASH Project (Community Reporting of Algorithmic System Harms), bug-bounty programs for algorithmic bias, launched July 2020.
Publication (as of Mar 2026): Gender Shades study (2018) by Buolamwini and Gebru demonstrated intersectional accuracy disparities in commercial facial recognition systems from IBM, Microsoft, and Face++. Dark-skinned women had the highest error rates; light-skinned men had the lowest. Over 4,900 citations on Semantic Scholar.
Recognition (2021): Named one of Fast Company's 10 Most Innovative AI Companies in the World.
Research impact (2019): Over 70 AI researchers publicly signed a letter defending AJL's research after Amazon challenged the Gender Shades findings as applied to its Rekognition system.

Publications

Comply to Fly? (report, Algorithmic Justice League; venue: AJL; published 2025-07). URL: ajl.org. Source: ajl.org, not checked.
Comply to Fly: Face Recognition at Airports (report, Algorithmic Justice League; published 2024). URL: ajl.org. Source: ajl.org, not checked.