Seth Fiegeman
That's right.
I mean, look, most people are probably more consumed by thinking about the models that we use in the chatbots, but they're talking about a lot of the key components that go into setting up and operating data centers from the data center server racks to cooling and power equipment to the cabling.
These are not the sexiest things, but they're essential if they want to build out data centers at massive scale here and abroad.
Immense.
And we're not even mentioning the pressure they feel from Google now, which seems to have built a model that either rivals or out-competes anything OpenAI currently has on the market, which gets back to, again, they want to have a competitive advantage, not just on the model side, but on the infrastructure that supports it.
And increasingly, they're tying themselves to all different corners of the market and all different key players in tech to make it so they're too big to fail.
You certainly ask interesting questions.
Look, we get it.
You're a discerning sort of person.
You're a Bloomberg listener after all.
I think that's right.
It really is telling about this moment for both companies.
OpenAI just is tapping every possible resource it can to meet its cloud computing needs.
And Amazon, which had previously invested in Anthropic, is also trying to have a piece of OpenAI and possibly other players down the road.
So it contributes to that incestuous web we keep talking about.
Everyone is backing everyone else.
A bit more near-term than some other deals we've seen OpenAI broker, and OpenAI is leaving the door open to expanding with more investment down the road.
I do think it's telling, though, that this is Amazon providing NVIDIA chips, not Trainium, not in-house silicon.
So on the one hand, it's a testament to Amazon's ability to build up cloud computing infrastructure at scale to meet OpenAI's needs, but you have to wonder what that means about the quality of Amazon's own chips and why it's not using them here.
It's possible, because I think the larger industry is also thinking more about investments in inference as opposed to training.