Yeah, so that's what that is.
Yeah, exactly.
So having a GPU then helps increase the level of both of those.
Gotcha.
Well, specifically about, like, the future of PCs, right?
Like, are we seeing a true shift in how computing power is used for AI workflows with what's going on right now?
And to put a larger framing on it, there was a great interview with Marc Andreessen on the Cheeky Pint podcast, and he basically said, like, comparing AI to the dot-com boom is probably the wrong comparison.
You should be comparing it to the PC era.
And, like, when PCs first hit the market and really started to take off. He calls this computing V2.
And that sounded right to me, but I'd like to get your take on that.
You're talking about the 120 billion parameter version?
And I think the same is true on the model side, right?
Like not only did OpenAI release the 120 billion version, they also released a 20 billion version.
And Google has been releasing a lot of...
uh, you know, small models at that level. Gemma 3n, for example. Yeah, Gemma 3n, like really, really small models, some of which you could allegedly even run on a phone. I mean, how you do that, I think, is still a little unclear. People are like, yeah, you can in theory, but we don't know how. Um, but there were two things that I wanted to follow up on. The first is the cloud point. There was a great take, I'm forgetting who it was, but I'll credit them in the comments,
who was basically saying, like, we spent all this time moving all of our computing to the cloud, but then that sort of defeats the purpose of the Internet being decentralized.
Right.
And now, all of a sudden, everything is, you know, going off to a couple of providers, and it doesn't create that resiliency.
To your point, if something gets hit, does the whole Eastern Seaboard go down, depending on where the data centers are?