Tim Davis
It's not like you just train a model and then amazingly, it just deploys everywhere.
That's not actually how the infrastructure works.
I think what we realized was that what we really needed to build was a platform where maybe that future is actually possible, where you could, in some ways, abstract away all this hardware complexity.
That idea goes into everything for Modular, which is a unified compute layer, and I'm happy to talk about that.
But I think one of the biggest product lessons, one of the most interesting developer lessons from our time at Google, was realizing, and this will be a controversial statement, so prepare yourself:
Most developers don't care about the hardware.
Great.
Because in a world where Nvidia is an incredible $4.5 trillion company, it sounds weird to say that, because in some ways the construct is, well, clearly the hardware is the most important thing.
For every developer I've talked to, including myself when building applications, what I care about is: what is the throughput?
What is the latency?
What is the accuracy target?
And what is my cost threshold?
Once I have those, they're my inputs.
Fundamentally, the hardware is the output that then has to meet those requirements.
And you end up being exposed to it in some ways, because in the end you don't actually have many choices that can meet those requirements.
And so we realized, well...
We can get a little bit into our thesis on the future of superintelligence, but we strongly believed in a concept there was at Google: what you train is what you serve.
And in today's world, that actually doesn't exist.