
AI at Work

Inside LaunchDarkly’s Mission to Make AI Safer for Software Delivery

29 Oct 2025

Description

In this episode of AI at Work, I sit down with Tom Totenberg, Head of Release Automation and Observability at LaunchDarkly, to explore what happens when artificial intelligence starts writing and shipping our software faster than humans can think. Tom brings a rare blend of technical insight and grounded realism to one of the most important conversations in modern software development: how to balance speed, safety, and responsibility in an AI-driven world.

We discuss the hidden risks of AI-fuelled shortcuts in software delivery and why over-reliance on AI-generated code can create dangerous blind spots. Tom explains how observability and real-time monitoring are becoming essential to maintaining trust and stability as teams adopt AI across the full development lifecycle. Drawing on LaunchDarkly’s recent investments in observability, he breaks down how guarded releases and real-time metrics help teams catch problems before users ever notice.

From the dangers of “vibe coding” to the rise of agentic AI in software pipelines, Tom shares why AI should be seen as an amplifier rather than a magic fix. He also offers practical advice for leaders trying to balance innovation with caution, reminding us that the goal is to innovate with intention: to measure what matters and build resilience through feedback and transparency.

Recorded during his time in New York, this episode captures both the human and technical sides of what it means to deliver software in an era where the line between automation and accountability is being redrawn.
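The guarded-release idea discussed in the episode can be sketched in a few lines: a new code path sits behind a feature flag, and a guard watches a real-time error metric, automatically switching the flag off when the error rate crosses a threshold. This is a minimal illustration of the pattern, not LaunchDarkly’s SDK; `FlagStore` and `RolloutGuard` are hypothetical stand-ins.

```python
class FlagStore:
    """Hypothetical in-memory feature-flag store (stand-in for a flag service)."""

    def __init__(self):
        self._flags = {}

    def set(self, key, enabled):
        self._flags[key] = enabled

    def is_enabled(self, key, default=False):
        return self._flags.get(key, default)


class RolloutGuard:
    """Watches an error metric and disables the flag when the rate is too high."""

    def __init__(self, flags, flag_key, max_error_rate=0.05, min_requests=20):
        self.flags = flags
        self.flag_key = flag_key
        self.max_error_rate = max_error_rate
        self.min_requests = min_requests  # avoid tripping on tiny samples
        self.requests = 0
        self.errors = 0

    def record(self, ok):
        self.requests += 1
        if not ok:
            self.errors += 1
        # Once enough traffic has been observed, roll back automatically
        # if the error rate exceeds the guard's threshold.
        if (self.requests >= self.min_requests
                and self.errors / self.requests > self.max_error_rate):
            self.flags.set(self.flag_key, False)


flags = FlagStore()
flags.set("new-checkout", True)  # ship the new path behind a flag
guard = RolloutGuard(flags, "new-checkout", max_error_rate=0.1, min_requests=10)

for i in range(15):
    ok = i % 3 != 0  # simulate a buggy new path: roughly one in three requests fails
    guard.record(ok)

print(flags.is_enabled("new-checkout"))  # → False: the guard rolled the release back
```

The point of the pattern is that the rollback decision is driven by a measured metric rather than a human noticing a pager alert, which is why the episode ties guarded releases so closely to observability.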


