Do you know your TPUs from your GPUs?
From American Public Media, this is Marketplace Tech.
I'm Megan McCarty Carino.
GPUs, or graphics processing units, have become the most important commodity in the AI boom and made Nvidia a multi-trillion dollar company.
But they could have competition from a different three-letter chip, the TPU, or Tensor Processing Unit.
These are developed by Google specifically for AI workloads.
Anthropic, OpenAI, and Meta have reportedly made deals for Google TPUs.
For more on what this means, we've got Christopher Miller, historian at Tufts and author of the book Chip War.
Yeah, I mean, what kinds of advantages do TPUs have over GPUs for these specific use cases?
And whenever we talk about AI processing, we often sort of break it down into training versus inference.
Training is kind of the most processing-heavy part, where you're processing these vast amounts of data for the machine learning.
And then inference is slightly less processing-heavy.
Say you're using a chatbot and ask it a question.
Are these primarily for one or the other or both of those parts of the AI process?
Right, and perhaps one example of that specialized hardware might be neural processing units.
These have been around for a while, as many AI applications have been around for a while.
But as this particular type of processing-heavy AI becomes more common on devices, are neural processing units on our devices becoming a bigger focus?
We'll be right back.
You're listening to Marketplace Tech.
I'm Megan McCarty Carino.