
Two Voice Devs

Episode 228 - AI Ethics: How Developers Can Build Fairer Systems

20 Feb 2025

Description

Are you building AI models and systems? Then you need to understand AI ethics! In this episode of Two Voice Devs, Allen Firstenberg welcomes Parul, a Senior Production Engineer at Meta, for a deep dive into the world of AI ethics. Learn why fairness and bias are critical considerations for developers, and discover practical techniques to mitigate bias in your AI systems.

Parul shares her experiences and passion for AI ethics, detailing how biases in training data and system design can lead to unfair or even harmful outcomes. This episode provides concrete examples, actionable advice, and valuable resources for developers who want to build more ethical and equitable AI.

More Info:
* Fairlearn: https://fairlearn.org/
* AIF360: https://aif360.readthedocs.io/en/stable/
* What-If Tool: https://pair-code.github.io/what-if-tool/

Timestamps:
00:00:00 Introduction
00:00:20 Guest Introduction: Parul, Meta
00:02:22 What is AI Ethics?
00:06:13 Why is AI Ethics Important?
00:08:15 AI Systems vs. AI Models
00:09:52 Examples of Bias in AI Systems
00:12:23 Minimizing Biases: Developer Responsibility
00:14:53 Tips for Minimizing Unfairness and Biases
00:19:40 Fairness Constraints: Demographic Parity
00:23:17 The Bigger Picture: Roles & Responsibilities
00:29:23 Monitoring: Bias Benchmarks
00:32:00 Open Source Frameworks for AI Ethics
00:34:02 Call to Action & Closing

#AIethics #Fairness #Bias #MachineLearning #ArtificialIntelligence #Developers #OpenSource #EthicalAI #TwoVoiceDevs #TechPodcast #DataScience #AIdevelopment
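One of the fairness constraints the episode covers (see the "Fairness Constraints: Demographic Parity" timestamp) can be made concrete with a few lines of code. Demographic parity asks that a model's rate of positive predictions be roughly equal across demographic groups. Below is a minimal sketch, assuming binary (0/1) predictions and a single sensitive attribute; the group names and prediction data are invented for illustration. Libraries like Fairlearn and AIF360, linked above, provide production-grade versions of metrics like this.

```python
def selection_rate(preds):
    """Fraction of positive (1) predictions in a list of 0/1 labels."""
    return sum(preds) / len(preds)

def demographic_parity_difference(preds_by_group):
    """Gap between the highest and lowest group selection rates.

    A value near 0 means the model selects all groups at similar rates;
    larger values indicate a potential demographic-parity violation.
    """
    rates = [selection_rate(p) for p in preds_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical model outputs, split by a sensitive attribute.
preds = {
    "group_a": [1, 1, 0, 1, 0],  # selection rate 0.6
    "group_b": [1, 0, 0, 0, 1],  # selection rate 0.4
}

gap = demographic_parity_difference(preds)
print(f"demographic parity difference: {gap:.2f}")  # prints 0.20
```

In practice you would compute this on held-out predictions and track it as one monitoring signal among several, since demographic parity can conflict with other fairness definitions (such as equalized odds) and with accuracy.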
