Decoder Host
Yann LeCun, who is Meta's chief AI scientist and often one of the first people in the industry to call bullshit on popular narratives, says the DeepSeek freakout is, quote, woefully unjustified and based on, quote, major misunderstandings about AI infrastructure investments.
He pointed out that when AI companies and tech companies today say they're going to spend billions of dollars, a good chunk of that money is going towards inference or using the models and not just training them. And total inference costs are only going to increase as multimodal AI with video and audio becomes more popular.
And no matter what, maintaining those models once they've been trained and serving them to millions of customers costs quite a lot. Just because you can train a model cheaply doesn't mean you can run inference on it at the scale OpenAI, Google, and Anthropic do. In other words, we might need those chips and data centers after all.
And if all that compute isn't being used for AI, well, it might get used for something else.
But that doesn't mean it's smooth sailing for OpenAI and Stargate. The timing of DeepSeek going viral almost immediately after the Stargate announcement has thrown some serious cold water on OpenAI's claims that it needs a half-trillion-dollar data center project to stay competitive.
I asked Kylie about this specifically and about whether OpenAI, with its constant concerns about cash flow, can really pull this off. And most importantly, whether they have anywhere near the amount of money they're claiming to need.
On Tuesday, after a full day of utter chaos in the world of AI and a more than trillion-dollar wipeout in tech stocks, Sam Altman posted an unassuming selfie with Microsoft CEO Satya Nadella on his X account. It was the kind of post that I now think of as classic Altman: short, ambiguous, and easy to read as confirming any assumption you already had.
Altman captioned the post, quote, the next phase of the Microsoft-OpenAI partnership is going to be much better than anyone is ready for. He included two exclamation points, just in case.