Reid Hoffman
from China, and part of the reason it took the market by storm is the thesis that it was created for a lot less money and a lot less compute. What I think is there are certainly some parts of the story that are incorrect. The thing that we're trying to figure out is which parts of it are incorrect.
And I would speculate with some vigor that they actually had some version of access to larger models to help with training. Because this is actually something we all knew already last year, the year before: large models will help train small models. And so that means when you train a small model effectively,
but you need a large model in order to do it, that's actually not disproving the need for these scale systems, because when you have better and better large-scale models, you'll also be able to train really, really good small models.
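The "large models help train small models" point is the idea usually called knowledge distillation: a small student model is trained to match a larger teacher's soft output distribution instead of hard labels. A minimal toy sketch of that idea (this is illustrative only, not DeepSeek's actual pipeline; both "models" here are stand-in linear maps):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# "Teacher": a fixed model whose soft outputs serve as training targets.
X = rng.normal(size=(200, 8))           # toy inputs
W_teacher = rng.normal(size=(8, 3))
teacher_probs = softmax(X @ W_teacher)  # soft labels, richer than hard 0/1 labels

# "Student": trained to match the teacher's distribution by minimizing
# cross-entropy against the teacher's soft labels (gradient descent).
W_student = np.zeros((8, 3))
lr = 0.5
for _ in range(500):
    student_probs = softmax(X @ W_student)
    grad = X.T @ (student_probs - teacher_probs) / len(X)  # softmax-CE gradient
    W_student -= lr * grad

# After training, the student's predictions track the teacher's closely.
agreement = (softmax(X @ W_student).argmax(1) == teacher_probs.argmax(1)).mean()
```

The key asymmetry this illustrates is the one in the conversation: you can get a cheap, strong student, but only because an expensive teacher already existed to produce the soft targets.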
Exactly.
So I would hazard strongly that there's something like that in the background. Now, it could be that they had some access to ChatGPT; certainly some of the data and evidence suggests that, in terms of the way that it answers and does certain things. It could be that they actually had access to a compute cluster of size, because then the so-called training run really makes sense.
And I cross-checked this across multiple groups, you know, outside groups, saying, hey, what makes sense here? And they're like, yeah, for the final training run on a serious compute cluster, that could be the dollars that were spent on this in order to make it happen. That doesn't include talent; it doesn't include all these other things.
Yeah, exactly. And I think it's nearly certain that it's dependent upon the large-scale compute, the larger models, in some way. The only question is we don't know in what ways and how, and I think that's one of the things that everyone's investigating. And so I think the kind of market frenzy on "oh my God, AI can be here without large-scale compute."