
MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn't detect her face, because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." It's an eye-opening talk about the need for accountability in coding as algorithms take over more and more aspects of our lives.