John Siracusa
You won't have any problem with this.
Supports up to four external displays over a single Thunderbolt port.
It's pretty amazing.
It's a pretty beefy GPU for what, as we will see, is a laptop GPU.
I want to pick a Max, which is what I've had: I had an M1 Max, and I have an M3 Max right now. But I want to pick the Max for the increased memory bandwidth, and I don't... Maybe not? No, not for what you do with it. Absolutely not. Fair. All right, well, that answers that. Well, that's why it's a good year for people who do not care about GPU. This is the year to get the Pro, because it's back to the old days where all you're giving up, if you don't get the big one, is GPU that you aren't going to use anyway. Hey, that's less heat and everything, so it's a win.
Yeah, I would bet against it.
I mean, there is a possibility because these companies are highly motivated to reduce the cost of inference.
So basically: can we get away with running models with less power, like less CPU power?
They're highly motivated to do that because it saves them a lot of money, right?
If there is a breakthrough in that area, if they say, "Oh, we've found a way to run models that work like our current ones but use half the RAM," or something like that, that is the only way you will get the future benefits of running locally. Because right now, unless you're already running local models,
don't buy a machine for the purposes of running local models.
Because if you buy this machine today, right now models are just getting bigger and bigger and harder and harder to run locally if you're talking about the bleeding edge models.
If you care about the bleeding edge models, don't run them on your laptop, right?
You have to run them on a server, because a lot of them you just literally can't run locally; you're not going to have enough memory, right?
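The memory point is easy to sanity-check with back-of-envelope arithmetic: just holding a model's weights takes roughly (parameter count) times (bytes per weight), before you even account for the KV cache or activations. A minimal sketch, where the model sizes and quantization levels are illustrative assumptions, not specs for any particular model:

```python
# Back-of-envelope: RAM needed just to hold a model's weights.
# Model sizes and byte widths here are illustrative assumptions.

def weight_memory_gib(params_billions: float, bytes_per_weight: float) -> float:
    """GiB required to hold the weights alone (no KV cache, no activations)."""
    return params_billions * 1e9 * bytes_per_weight / (1024 ** 3)

for name, params in [("8B model", 8), ("70B model", 70), ("405B model", 405)]:
    fp16 = weight_memory_gib(params, 2.0)  # 16-bit weights
    q4 = weight_memory_gib(params, 0.5)    # 4-bit quantized weights
    print(f"{name}: ~{fp16:.0f} GiB at fp16, ~{q4:.0f} GiB at 4-bit")
```

Even at aggressive 4-bit quantization, a 405B-class model needs on the order of 200 GiB for weights alone, which is why the bleeding-edge models live on server GPUs rather than laptops, and why a "half the RAM" breakthrough would change the calculus.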
So unless there's a breakthrough in that or unless you're very interested in running small local models,
Again, you're probably already doing that.
So if you're not currently running local models, do not buy an M5 MacBook Pro based on your ability to run local models. Because if you're not already doing it, you're just going to be less likely to do it in the future, because the models are just going to get bigger and more demanding.
And honestly, most people probably should be using the bigger models, because they're just more capable, and you don't want to run them yourself. They're so demanding. There's a reason the GPUs they're running them on cost like 60 grand or whatever, and there are multiple of them.
You don't want to be doing that yourself unless you're already doing it.