A SHALINI KANTAYYA film
Coded Bias

When MIT researcher, poet, and computer scientist Joy Buolamwini uncovers racial and gender bias in AI systems sold by big tech companies, she embarks on a journey alongside pioneering women sounding the alarm about the dangers of unchecked artificial intelligence that impacts us all. Through Joy’s transformation from scientist to steadfast advocate and the stories of everyday people experiencing technological harms, Coded Bias sheds light on the threats AI poses to civil rights and democracy.
View screening details here.

WATCH THE TRAILER

help us unmask #CODEDBIAS

sign up · share · learn · donate · host a screening
PERSONAL STORIES

WHEN AI GOES WRONG, REAL LIVES ARE AT STAKE

Coded Bias weaves the personal stories of people whose lives have been directly impacted by unjust algorithms. You can make an impact by helping us spread the word about the film, hosting a screening, and/or entering your email below to join the movement towards equitable and accountable AI.

join the movement
Facial Recognition Technologies

Facial recognition tools sold by large tech companies, including IBM, Microsoft, and Amazon, are racially and gender biased. They have even failed to correctly classify the faces of icons like Oprah Winfrey, Michelle Obama, and Serena Williams. Around the world, these tools are being deployed, raising concerns of mass surveillance.

Employment

An award-winning and beloved teacher encounters injustice with an automated assessment tool, exposing the risk of relying on artificial intelligence to judge human excellence.

Housing

A building management company in Brooklyn plans to implement facial recognition software for tenants to use to enter their homes, but the community fights back.

Criminal justice

Despite working hard to contribute to society, a returning citizen finds her efforts in jeopardy due to law enforcement risk assessment tools. The criminal justice system is already riddled with racial injustice, and biased algorithms are accentuating it.

LET’S PUT A FACE TO THE HARMS OF AI

HAVE YOU WITNESSED UNJUST ARTIFICIAL INTELLIGENCE IMPACTING YOU OR OTHERS?

Help us shed light on how AI harms affect civil rights and people’s lives around the world. You can share your story using the hashtag #CodedBias or send us a private message.

share your story
learn more

WHAT DOES IT MEAN WHEN AI INCREASINGLY GOVERNS OUR OPPORTUNITIES?

AI systems are increasingly infiltrating our lives, influencing who gets a job, which students get admitted to college, how cars navigate the roads, what medical treatment an individual receives, and even who we date. And while builders of AI systems aim to overcome human limitations, research studies and headlines continue to remind us that these systems come with risks of bias and abuse. AI reflects the coded gaze: our term for the priorities, preferences, and at times prejudices of those who have the power to shape the technology.

learn more
Press/media
Experts urged US Congress to regulate the use of facial recognition technology
Read More
GOING BEYOND SELF-REGULATION

FIRST FEDERAL LEGISLATION FOR FACIAL RECOGNITION

Coded Bias illuminates widespread misconceptions about AI, emphasizes the urgent need for legislative protection, and follows the Algorithmic Justice League’s journey to push for the first-ever U.S. legislation placing limits on facial recognition technology. Support our work by sharing our advocacy and policy initiatives, or by making a donation so we can keep going.

make a donation

Join the Algorithmic Justice League Newsletter.

Stay up to date with the movement towards equitable and accountable AI.

SIGN UP
ABOUT THE FILM DIRECTOR

SHALINI KANTAYYA

Shalini Kantayya directed the season finale of the National Geographic series Breakthrough with executive producer Ron Howard. Her debut feature film, Catching the Sun, premiered at the Los Angeles Film Festival and was named a New York Times Critics’ Pick. The film was released globally on Netflix on Earth Day 2016 with executive producer Leonardo DiCaprio and was nominated for the Environmental Media Association Award for Best Documentary. Kantayya is a TED Fellow.

ADDITIONAL CREDITS

featured cast

Joy Buolamwini
Founder of the Algorithmic Justice League
@jovialjoy

Cathy O’Neil
Author of Weapons of Math Destruction
@mathbabedotorg

Meredith Broussard
Author of Artificial Unintelligence
@merbroussard

Safiya Noble
Author of Algorithms of Oppression
@safiyanoble

Zeynep Tufekci
Author of Twitter and Tear Gas
@zeynep

Amy Webb
Author of The Big Nine
@amywebb

“Coded Bias is a documentary film you can’t afford to miss.”
- Marie Claire
“AI is not neutral, and women are leading the charge to ensure our civil rights are protected.”
- Sundance Institute
“A must-watch movie for all leaders in 2020.”
- Inc.

@AJLUNITED

FOLLOW US ON SOCIAL
TWITTER
FACEBOOK
LINKEDIN
YOUTUBE
FOLLOW US

#CodedBias #EquitableAI #AccountableAI #InclusiveAI #ResponsibleAI #EthicalAI #AIbias #AIharms #MachineBias #ArtificialIntelligence #InclusiveTech #AJL #AlgorithmicJusticeLeague
