Grant Harvey
There was an Economist piece, and Rohan Paul, who's a really great AI Twitter personality, shared this.
So basically NVIDIA's Grace Blackwell chips use about 40 kilowatts for inference versus something like 12 kilowatts for conventional computing.
You can correct me if those numbers are off.
That's just what this tweet said.
And training jumps to 80 kilowatts.
So like, I guess from your perspective, how much of the energy efficiency problem is about the chips themselves versus the architecture?
Is it both?
What's your take on that?
Exactly.
So let's maybe zoom this out a little bit more.
So what I'm hearing is that you're pretty bullish on not necessarily needing bigger and bigger models, but maybe many more smaller, more efficient, more niche models.
Or do you think it's a combo of the two?
Where do you think this is all heading?
Is SambaNova ever going to create your own models?
Have you considered that or have you created them?
Or are you just staying focused on infrastructure?
Or like Anthropic's whole thing, where Dario Amodei was basically like, yeah, you know, each model is basically its own business.
So it's like, yeah, maybe you train a model for 200 million and that's profitable, but then you have to go make another business: train a model for a billion, and then you have to make that one profitable too.
Agreed.
It's sort of like the dream of the custom GPT marketplace, but just like custom open source models everywhere.