Mandeep Singh
And look, to my mind, they are going aggressive, adding more capacity than they probably need, because they think that if they get market share, if they get companies and users onto their product, then they will be able to monetize it and probably drive some companies out of competing with them, given the scale involved here.
I think right now xAI must be thinking along the same lines, and they are doing a $20 billion deal with some private financing.
But look, when OpenAI announces a 10-gigawatt deal, we're talking $500 billion, not $20 billion anymore.
So the numbers are getting bigger and bigger.
Well, when I look at Mag7, your Broadcom is not in Mag7.
It's a $1.6 trillion company.
You know, OpenAI probably, you know, it's one of the- And they're up 10% because of this.
No, I mean, look, right now their gross margins would be negative if you factor in the training costs.
Inference-wise, yes, they are making some money.
But clearly, if you include everything, and just to compare it with Google: Google has an annual cost of revenue of around $100 billion.
That powers all of their apps, you know, Search, YouTube, everything that they run.
OpenAI's compute costs are probably north of $20 billion right now.
If they're adding 26 gigawatts more capacity, we're talking compute costs multiplying at least 25-fold.
From that perspective, you have to ask yourself, how much incremental revenue do you want to see from OpenAI to justify this potentially $1 trillion in compute infrastructure spend?
And that's where Google's infrastructure is so efficient, because less than five gigawatts of compute gets them to over $400 billion in revenue.
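The back-of-envelope math in this exchange can be sketched as follows. It assumes, as implied by the 10-gigawatt, $500 billion figure, a roughly linear infrastructure cost of about $50 billion per gigawatt; the $20 billion current compute cost and 26-gigawatt expansion are the figures quoted above, not independently verified.

```python
# Rough sanity check of the figures discussed, all in billions of USD.
# Assumption: infrastructure cost scales linearly with capacity at the
# rate implied by the 10 GW / $500B OpenAI announcement.
cost_per_gw = 500 / 10  # ~$50B per gigawatt (implied, not stated directly)

planned_gw = 26
planned_spend = planned_gw * cost_per_gw  # ~$1,300B, i.e. north of $1 trillion

current_compute_cost = 20  # OpenAI's estimated current annual compute cost
# If running costs scale with capacity, "at least 25-fold" implies roughly:
scaled_cost = current_compute_cost * 25  # ~$500B per year

print(planned_spend, scaled_cost)
```

This is only an order-of-magnitude sketch, but it shows why the speaker jumps from a 26-gigawatt build-out to a "potentially $1 trillion" spend figure.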