Tim Davis
But I would certainly say that many of the decisions made at companies like Meta and Google to build these AI frameworks on top of CUDA, as a programming model for NVIDIA's accelerated compute, really acted as a massive distribution mechanism for NVIDIA to reach mass scale.
It accelerated the adoption and penetration of model execution on NVIDIA's hardware.
And at the time those decisions were made, frameworks like TensorFlow and PyTorch were the best things around.
And so people went, well, let's just use that.
But by doing that, they obviously changed the trajectory, certainly for NVIDIA (and I'm sure there are many happy stockholders at that company), but also, more broadly, how other companies can come in and compete, right?
Yeah.
A lot of what we fundamentally believe is that a unified compute model is not only good for the penetration of applications, both in the data center and at the edge, it's also good for competition.
It makes it possible for more people to compete, and there will be more innovation on the silicon side.
But Grant, you also asked about all of these deals and this sort of circular money flow.
Right.
You know, I think parallels have been drawn. There's a guy, Tomasz Tunguz, a former VC who has a wonderful blog post on this, comparing it back to when we were laying fiber in the 2000s.
And I think the difference now, of course, is GPU utilization. Around the world right now, most GPUs are probably running reasonably hot, given the amount of AI being consumed.
It's a very different time from what was happening, certainly in America, when we were laying fiber and it just wasn't being utilized yet.
And then, of course, across the arc of 20 years, that's changed.