When MIT Media Lab researcher Joy Buolamwini discovers that facial recognition does not see dark-skinned faces accurately, she embarks on a journey to push for the first-ever U.S. legislation against bias in algorithms that impact us all.
- Awards
- 3 wins & 6 nominations total
- Self - Author, Weapons of Math Destruction
- (as Cathy O'Neil Ph.D.)
- Self - Author, Twitter and Tear Gas
- (as Zeynep Tufekci Ph.D.)
- Self - Author, Automating Inequality
- (as Virginia Eubanks Ph.D.)
- Self - Technical Co-Lead, Ethical A.I. Team at Google
- (as Timnit Gebru Ph.D.)
- Self - Author, Algorithms of Oppression
- (as Safiya Umoja Noble Ph.D.)
Featured reviews
The second argument - what if our government becomes like China - is flawed as well. Facial recognition AIs are going to get better; even if we do not work on them here, someone will. Anything useful can also be used as a weapon. So if the government does want to use facial recognition, it will simply get it. It is probably better to have a known, working system than one bought hastily and rushed into place.
It is odd that they barely mention any AI or ML outside of facial recognition, despite facial recognition being only a small part of what is out there.
All in all, it might be good for getting you started on research of your own, but it is mostly misdirected fury.
The execution of this documentary, however, is very underwhelming, to say the least. There are the usuals: catchy montages, TED-style interviews, news soundbites, and, most annoying of all, artificially created (pun intended) graphics of AI scanning data in a stereotypical digital font, paired with silly sound effects. Unless the primary audience of this documentary is fifth graders, I don't understand why it's necessary to rehash them incessantly. And then there's the unimaginative 'robotic voice.' It's just puerile.
Maybe the producers are wary that people still won't get the danger of unregulated AI without these gimmicks. But I'd argue that people would be more alarmed to learn how AI has been infiltrating and affecting our lives in the least expected ways. If the documentary can clearly point out the potential harms, people will naturally find the lack of regulation disturbing - no silly visuals or sound effects needed. Sometimes I think they actually undermine the severity of the danger at hand. For example, the scene where a teenager is mistakenly stopped by plainclothes police: instead of being accompanied by yet another piece of cheesy soundtrack meant to suggest danger, it would be so much more powerful if everything were just eerily silent.
And the interviews and info - yes, AI is like a black box even to the programmers, but can you explain it in layman's terms so that people get it? - could be a lot more insightful. Even some short Vox-style YouTube clips have explored these issues in greater depth.
The themes explored are a bit all over the place too. I get it, this domain is relatively new, so the vocabulary and focus aren't that streamlined yet, but still... Sometimes the documentary brings up issues of obvious bias, which is consistent with the title, but sometimes we don't even know what the problem is - it's simply an issue of things being completely nontransparent and/or unverified by a third party. The China parts are also a little disjointed from the rest of the documentary, and the country itself is painted in broad strokes - it's as if we can't do good until we can identify a bad guy to feel good about ourselves.
Fact: The darker one's complexion is, the LESS likely it is that there are usable videos or photos for investigators or prosecutors.
The makers of this film claim the opposite is the case - they claim there is a bias against persons with darker complexions - when in fact that is not at all what the peer-reviewed research shows.
Did you know
- Quotes
Self - Author, Weapons of Math Destruction: On internet advertising as data scientists, we are competing for eyeballs on one hand, but really we're competing for eyeballs of rich people. And then, the poor people, who's competing for their eyeballs? Predatory industries. So payday lenders, or for-profit colleges, or Caesars Palace. Like, really predatory crap.
- Connections: Featured in Jeremy Vine: Episode #4.95 (2021)
- How long is Coded Bias?
Details
- Release date
- Countries of origin
- Official sites
- Language
- Also known as
- Kodlanmış Önyargı
- Filming locations
- Production companies
Box office
- Gross US & Canada
- $10,236
- Opening weekend US & Canada
- $10,236
- Nov 15, 2020
- Gross worldwide
- $10,236
- Runtime: 1 hour 26 minutes
- Color