Ryan Knudson
What does that mean?
What is inference computing?
At its most basic, training is the learning part, when an AI model is devouring everything it can.
And inference is the doing, the actual responding to prompts.
This decision is starting to pay off.
In October, AMD announced that massive deal with OpenAI to help it run inference functions for ChatGPT.
Under the deal, OpenAI agreed to buy a ton of AMD chips starting next year.
And in return, OpenAI could get as much as a 10% ownership stake in AMD.
Yet amid all the excitement and money, there's also concern about an AI bubble.
And the AMD deal with OpenAI raised questions among some investors.
After the break, Robbie sits down with Lisa Su to talk about the concerns about an AI bubble.
Recently, Robbie flew to a big AMD office in Austin to meet with Lisa Su.
One of the main things Robbie wanted to talk with Su about is the AI bubble, and specifically whether AMD's deal with OpenAI might contribute to that bubble.
Under the terms of the deal, AMD gave OpenAI a big financial incentive to use its chips.
If OpenAI hits certain milestones for deploying AMD's chips, it has the option to buy AMD shares at a steep discount, at just one cent per share.
Su defended the deal in her conversation with Robbie.
Is she at all concerned about an AI bubble?
Su says that once AI models are up and running and people start relying on them, there'll be almost no limit to the future demand for AI computing.
She estimates that the overall market for AI could be worth $1 trillion a year.
But isn't Su, the head of an AI chips company, going to say that there's never-ending demand for chips, the thing she happens to be selling?