
Your Undivided Attention

AGI Beyond the Buzz: What Is It, and Are We Ready?

30 Apr 2025

Description

What does it really mean to 'feel the AGI'? Silicon Valley is racing toward AI systems that could soon match or surpass human intelligence. The implications for jobs, democracy, and our way of life are enormous.

In this episode, Aza Raskin and Randy Fernando dive deep into what 'feeling the AGI' really means. They unpack why surface-level debates about definitions of intelligence and capability timelines distract us from urgently needed conversations around governance, accountability, and societal readiness. Whether it's climate change, social polarization and loneliness, or toxic forever chemicals, humanity keeps creating outcomes that nobody wants because we haven't yet built the tools or incentives needed to steer powerful technologies.

As the AGI wave draws closer, it's critical that we upgrade our governance and shift our incentives now, before it crashes on shore. Are we capable of aligning powerful AI systems with human values? Can we overcome geopolitical competition and corporate incentives that prioritize speed over safety?

Join Aza and Randy as they explore the urgent questions and choices facing humanity in the age of AGI, and discuss what we must do today to secure a future we actually want.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X: @HumaneTech_ and subscribe to our Substack.

RECOMMENDED MEDIA

Daniel Kokotajlo et al.'s "AI 2027" paper
A demo of Omni Human One, referenced by Randy
A paper from Redwood Research and Anthropic that found an AI was willing to lie to preserve its values
A paper from Palisade Research that found an AI would cheat in order to win
The treaty that banned blinding laser weapons
Further reading on the moratorium on germline editing

RECOMMENDED YUA EPISODES

The Self-Preserving Machine: Why AI Learns to Deceive
Behind the DeepSeek Hype, AI is Learning to Reason
The Tech-God Complex: Why We Need to be Skeptics
This Moment in AI: How We Got Here and Where We're Going
How to Think About AI Consciousness with Anil Seth
Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn

Clarification: When Randy referenced a "$110 trillion game" as the target for AI companies, he was referring to the entire global economy.

Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
