Nathaniel Whittemore
The latest report suggests that instead of looking to become one of Google's first large TPU customers, Meta is instead placing large orders for AMD's latest chips.
Poo claimed that this isn't a full replacement of Meta's fleet, but rather a strategic purchase to meet short-term requirements more efficiently.
He reported that Meta could still deploy its custom silicon at a later date, with a focus on specialized workloads.
I think the more interesting conversation is what this implies about a broader shift.
Alongside Meta, OpenAI and Anthropic launched custom silicon programs last year, aiming to reduce reliance on NVIDIA and AMD. But it seems increasingly unlikely that these custom silicon initiatives will make sense in the context of rapidly accelerating compute needs.
Some are even questioning whether there's any financial benefit to developing an in-house chip, with investor Nikolai Sgoninus posting, "AMD's total cost of ownership and performance per watt in their latest chips beats out anything Meta can do internally, and TPUs apparently too."
Last year was all about how NVIDIA and AMD could see erosion of market share.
Now it seems the hyperscalers won't have the luxury of seeking alternatives and could fall back on established players to keep up with demand.
In partnership news, OpenAI has signed a three-year deal to integrate its AI models into ServiceNow's platform.
The Wall Street Journal reported that ServiceNow users would be able to choose OpenAI's models within the platform, and the deal would involve a revenue commitment from ServiceNow.
OpenAI CEO Brad Lightcap told the Journal, "Enterprises want OpenAI intelligence applied directly into ServiceNow workflows. Looking ahead, customers are especially interested in agentic and multimodal experiences so they can work with AI like a true teammate inside ServiceNow."
ServiceNow president Amit Zaveri said the integration will go way beyond backend optimizations.
He said that OpenAI's computer use agents will be granted access to IT tasks like restarting a computer remotely, essentially allowing them to function as automated IT support.
Zaveri said the agents could also help companies access data stuck in legacy systems like mainframe computers.
The computer use models, he said, are now doing this through learning and feeding the results back into the ServiceNow workflow platform.
I think we're going to learn a lot this year about exactly how the agentic business model is going to shake out.
It is a very different approach to integrate your technology inside other delivery platforms like ServiceNow versus just trying to be the ServiceNow.
I don't think it's clear exactly how that plays out, but I think there's going to be a lot of experiments this year.
It also, however, continues to be a land grab for enterprise business, and I expect that to do nothing but ramp up throughout the year.