Mandeep Singh
So you have to ask yourself, how are they going to source the power?
And that's where I think some of that inflation when it comes to sourcing that power is going to show up in this form.
The trend is more towards specialization.
What Anthropic has shown is you can focus on a particular area and really improve the model to the point that you can gain a lead.
So that's where I think OpenAI will do well, as will Gemini, Grok, and all these frontier models.
At CES, Amazon launched Alexa Plus.
So there's a lot going on with agentic commerce and voice, and Apple obviously has to step up in terms of whichever LLM they want to use.
And to my mind, you know, Google is the most obvious choice.
They seem to be confident about their own model, which has so far trailed the likes of OpenAI, Anthropic, and Gemini in terms of capabilities.
But it sounds like they want to make sure they have the capacity to deploy AI, and that's where nuclear is an interesting choice.
Yeah, look, it's hard to pinpoint exactly what portion of that $50 billion NVIDIA can capture through H200s, but there is no doubt that the frontier LLM companies, from DeepSeek to Alibaba's Qwen and Kimi, have all trained models that have kept up in terms of functionality with the frontier models here, whether it's Gemini or OpenAI.
And so from that perspective, you have to ask yourself, how have these companies trained their models, and is it all based on their in-house or Huawei chips?
And the answer is hard to discern sitting here, but to my mind, they would welcome any opportunity to get a big NVIDIA cluster, because at the end of the day, when it comes to training, NVIDIA has proven to be the one chip company that is the most useful for building big training clusters.
Yes, we have the TPU news and all that, but everyone universally wants to train their models on NVIDIA.
And so from that perspective, H200, just on the training side, could be a pretty sizable $25 to $30 billion opportunity next year.
Yeah, and that's where continuity is the main point, because what NVIDIA gives you is that backward compatibility.