Azeem Azhar
And if you follow the news and the analysis, you'll see that story come up time and time again.
So this is a long-term cycle.
This is not just about large language models.
This is not just about whether OpenAI can grow and become profitable.
This is a fundamental shift in the economy, as fundamental as going from 1880, when almost nobody was using electricity in the economy, to the 1930s in the US, when electricity had become the prime mover, the main way in which factories and offices got their power.
More and more economic activity will move into computational systems, even once LLMs look completely long in the tooth. Who will use an LLM then? In the same way that not many of us ride penny-farthing bicycles to work.
And so it's really fascinating, again, to see how that is changing the narrative.
Now, some of this, I think, is just expediency.
It's just, let's take advantage of the changes.
But some, I think, is real.
We know, for example, that Google has done a deal with Commonwealth Fusion Systems for a 400 megawatt power tranche when their fusion reactor goes live in a few years.
And Helion Energy, which is another fusion company, has ties with Microsoft to power data centers.
So this is a really, really significant problem.
It's a big issue in the US, much less of an issue in China, where they've mastered the ability to deliver clean electrons at scale.
There's also a squeeze coming between inference, which is the part of AI activity that makes money, and training, which is the product development for your next model.
Model companies will be battling over where to put their resources: into training the next model, or into serving customers for revenue today.
They have lots of resources, but even those resources are not infinite.