
Jaden Schaefer

Speaker
1542 total appearances

[Chart: Appearances Over Time]

Podcast Appearances

AI in Business
Microsoft Reveals Maya 200 AI Inference Chip

And yes, it does cost a lot of money; it is very intense.

But I think it's also important to remember that there are millions of people around the world using these AI models.

And we also need to optimize the tech stack for the people who are generating stuff.

So I think training oftentimes gets a lot of the headlines, and people talk about it a lot, because it's basically this massive upfront compute demand, right?

In order to train one of these models, you're spending millions and millions of dollars.

I think inference is quietly becoming a really dominant cost center for a lot of these AI companies, because their models are getting deployed to millions of users.

That's chatbots.

And then if you look at Google, that's all of the search tools.

You have copilots for Microsoft and a bunch of others, and a lot of the enterprise software.

So every query, every autocomplete, every generated paragraph, every bit of that is consuming compute power and cooling.

So as a result, even a very small efficiency gain at the chip level can translate into some really big cost savings at cloud scale.
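
To make that scaling point concrete, here is a rough back-of-envelope sketch. The query volume, per-query cost, training cost, and efficiency figures below are illustrative assumptions rather than numbers from the episode; the only point is how a small per-query gain compounds once a model is serving very large traffic.

# Hypothetical back-of-envelope comparing one-time training cost with ongoing
# inference cost, and the effect of a small chip-level efficiency gain.
# Every number below is an illustrative assumption, not a figure from the episode.

training_cost = 100_000_000        # assumed one-time training spend, in dollars
queries_per_day = 1_000_000_000    # assumed daily inference requests across products
cost_per_query = 0.0005            # assumed compute + cooling cost per query, in dollars
efficiency_gain = 0.05             # assumed 5% per-query saving from a better inference chip

daily_inference_cost = queries_per_day * cost_per_query
yearly_inference_cost = daily_inference_cost * 365
yearly_savings = yearly_inference_cost * efficiency_gain

print(f"One-time training cost:    ${training_cost:,.0f}")
print(f"Yearly inference cost:     ${yearly_inference_cost:,.0f}")
print(f"Yearly savings at 5% gain: ${yearly_savings:,.0f}")

Under these made-up numbers, a single year of serving traffic already costs more than the one-time training run, and a 5% chip-level efficiency gain is worth millions of dollars per year, which is the kind of financial equation the Maya 200 is aimed at.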

So it's interesting, because this is obviously something Microsoft is concerned about, but every other AI company should be, and is, concerned about this as well. They need to make those cost savings not just when they're training the model, but when they're actually generating stuff.

Microsoft right now is betting that this new Maya 200 is going to be a really big shift in that financial equation.

They said that the chip is designed to essentially run today's largest frontier models.

So you can imagine the models from partners like OpenAI, and it's going to be able to do that on a single node while leaving enough headroom to accommodate larger and more demanding architectures in the future, which is kind of interesting, right?

They're not just looking at what OpenAI and their own AI models need today.

They're looking at what those models are going to need in the future.
