Matt Day
Well, for Microsoft, it's really important.
They got a bit of a later jump than their biggest rivals, Amazon and Google, in building their own silicon.
They see this as an important way to reduce costs, to find another source of availability.
So this is going to be a really big test for their chips unit and for their cloud business.
So NVIDIA is the workhorse that powers the vast, vast majority of the AI workloads in Microsoft's data centers.
If you're using OpenAI as a ChatGPT consumer, chances are you're pinging an NVIDIA chip running in some Microsoft data center somewhere.
Now, like everybody in the industry, Microsoft would love to have some other options, hence this effort with Maia to try to build a better mousetrap, one more suited for their data centers and one that ideally could reduce some of their costs in the long term.
So it's not clear when customers are going to get their hands on this.
The first units are live today, Microsoft tells us, in data centers near Des Moines, Iowa.
More are coming to the Phoenix area.
But we don't have a date yet for a wider rollout or, crucially, for what availability looks like.
Is this going to be a global rollout?
Or is this more of an R&D step, with future chips going to carry the mail for them?
So it's kind of yet to be seen how broad they take this one.
Investors are really banking on incredible growth.