Reid Hoffman
Podcast Appearances
I like just literally a field of interesting companies. And I think that the public will benefit a lot from that, just kind of like the way they benefit a lot from, you know, Wikipedia, the Internet, you know, free communications and a bunch of other things. But I don't think it's necessarily kind of, you know, kind of completely broadly dispersed.
Yeah, happy to do it. By the way, one of the benefits of me being able to speculate is I have no internal information from either Microsoft or OpenAI, so I can speak entirely as an outside commentator just looking at this. So DeepSeek released a highly competent model
from China, and part of the reason it took the market by storm is the thesis was that it was created for a lot less money and a lot less compute. What I think is there's certainly some parts of the story that are incorrect. The thing that we're trying to figure out is which parts of it are incorrect.
And I would speculate with some vigor that they actually had some version of access to larger models in helping training. Because this is actually something we all knew already last year, year before, is that large models will help train small models. And so that means when you train a small model effectively,
but you need a large model in order to do it, that's actually not disproving the need for these scale systems, because when you have better and better large-scale models, you'll also be able to train really, really good small models.
Exactly.
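The "large models help train small models" point is, in essence, knowledge distillation: the smaller model is trained to imitate the output distribution of the larger one. Below is a minimal sketch of that idea in PyTorch; the model sizes, random batch data, and temperature value are purely illustrative assumptions and are not details from the conversation.

```python
# Minimal knowledge-distillation sketch: a small "student" model learns
# to imitate a larger "teacher" model's output distribution.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in "large" and "small" models (sizes are arbitrary for illustration).
teacher = nn.Sequential(nn.Linear(128, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 2.0          # softens both distributions before comparison

x = torch.randn(32, 128)   # a batch of inputs (random stand-in data)

with torch.no_grad():
    teacher_logits = teacher(x)   # the large model's outputs act as soft targets

student_logits = student(x)

# KL divergence between the softened distributions: the student is pushed
# toward the teacher's behavior rather than toward hard labels.
loss = F.kl_div(
    F.log_softmax(student_logits / temperature, dim=-1),
    F.softmax(teacher_logits / temperature, dim=-1),
    reduction="batchmean",
) * temperature ** 2

optimizer.zero_grad()
loss.backward()
optimizer.step()
```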
So I would hazard strongly that there's something like that in the background. Now, it could be that they had some access to ChatGPT; certainly some of the data and evidence suggests that, in terms of the way that it answers and does certain things. It could be that they actually had access to a compute cluster of size, because then the so-called training run really makes sense.