On April 9th, Rene Haas, CEO of Arm Holdings, a British semiconductor and software design company, made a statement about data center energy consumption that most people would find shocking. He said, "By the end of the decade, AI data centers could consume as much as 20% to 25% of U.S. power requirements. Today that's probably 4% or less."

Everyone wants to talk about AI, but this reality is something we don't discuss nearly enough. AI may be the greatest unrecognized threat to the environment today, because AI is an energy hog. For example, an Internet search in ChatGPT requires nearly 10 times as much energy as the same search on Google. Are the added benefits or the improved experience worth it? What about at scale?

In this episode of the Art of Supply podcast, Kelly Barner takes an honest look at the very real problem of AI-driven energy consumption:

- Why AI requires so much energy to operate
- Projections for the growth of AI usage, and therefore of AI energy consumption
- How the use of AI should change given today's sensibilities about sustainability

Links:

- Kelly Barner on LinkedIn
- Art of Supply LinkedIn newsletter
- Art of Supply on AOP
- Subscribe to This Week in Procurement