Marco Arment
And the very, very largest models, the hardware to run those is so specialized that Apple not only isn't going to make something competitive for it, but shouldn't make something competitive for it, because that's fairly far out of their main areas of expertise.
You're looking at the highest-end servers from NVIDIA-type stuff, and also Google's custom stuff.
That's what this world mostly looks like.
And so Google has the whole stack, top to bottom, to develop these models, to host these models, to serve them, to scale them.
Apple is not going to be competitive on that level for a long time, if ever.
Where Apple, I think, might be competitive long term is on-device models.
That makes a lot of sense.
That is well within Apple's expertise.
You know, maybe not so much in the cutting edge of AI these days, but I feel like they can get there, you know, kind of reasonably in maybe five years.
They can probably have their own on-device models being developed that are class-leading enough to actually use only their stuff. But for the big, like, world-knowledge server models, not only are they not in that game now, either in the models or the hardware to serve them, but I don't think they will ever be in that game, because, you could probably argue, they maybe shouldn't even try a lot of those things.
So I think this is probably bigger than it seems.
And I think this part of it where Siri just runs on Google servers, I think that's going to just be the default of how this goes for the foreseeable future.
And also, I think that's probably the right move.
I can't imagine they would mention it.
But also, I wouldn't necessarily assume that private cloud compute, the way it was advertised, will be the situation forever.
Because, everything we saw at that initial Apple Intelligence presentation, how much of that has actually panned out, or will ever pan out?