
AI: Voice or Victim?

Bias in, Bias Out: What AI Gets Wrong with Women with Siri Swahari (CGI)

26 Nov 2025

Description

When a group of powerhouse women at the MIT AI-Powered Women Conference asked ChatGPT one simple question, "What do you think I look like?", the results were… revealing. And not in a good way.

In this live episode from Boston, we sit down with Siri Swahari, Vice President & Partner at CGI, to unpack what happened that night, why AI keeps defaulting to the same stale stereotypes, and how leaders can push for systems that are actually fair, representative, and useful.

Siri breaks down:
- The viral moment when AI generated bald white dudes for women of color
- Why the problem isn't "men vs women"… it's the data
- What organizations must fix before unleashing AI tools on their teams
- How bias sneaks into models, even when intentions are good
- Why representation, guardrails, and real stress-testing matter
- What leaders should ask before adopting any AI solution
- Why she's cautious about AI's pace, but still optimistic
- Her work building The Women Executive Circle, a growing cross-state community supporting senior women leaders

If you work in HR, ops, tech, policy, or anywhere AI touches people, this episode will hit home.

👉 Watch the full episode on YouTube: https://youtu.be/4D78uhi-rw0
