Alex Wagner
So first of all, it's so funny because all of the worries about AI were that it was too expensive, right? People were seeing the fact that you have to spend these billions of dollars. Like OpenAI last year raised the biggest funding round in history at $6.6 billion. And the big complaint was, well, this AI technology is too expensive to use.
They're losing billions just to run it and train it every year. And so therefore the industry is going to fall apart. Now everyone's worried because it's too cheap, which I think is just so funny. But basically, look, this is the way that the AI industry was always running. OpenAI's stated goal was to make intelligence that's too cheap to meter.
Basically, the idea was: we want to be able to provide this stuff at a cost that is so inexpensive that you'll be able to build whatever your heart desires with AI. And, you know, really what these Chinese engineers have done is use some new techniques, which have largely come about thanks to some of the constraints that they've had.
So they haven't been able to use the state-of-the-art NVIDIA chips, which means this process that we're doing over here in the United States of just making the servers bigger and, you know, adding more data has not been available to them. So they've had to introduce some tricks to make the models more efficient.
And I could get into all the technical details if you want, but basically the way to think about this is: they have used the constraints to build a much more efficient model than anybody else has, through some different techniques that have just been starting to roll out in the Western models, things like reasoning and reinforcement learning.
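The interview only names the technique, but the core idea behind reinforcement learning on verifiable tasks can be sketched in a toy form: sample answers, reward the ones that check out against a known result, and reinforce whatever produced them. Everything below (the preference table, candidate answers, learning rate) is illustrative, not DeepSeek's actual method:

```python
import random

def train_toy_rl(correct_answer=4, candidates=(3, 4, 5),
                 steps=200, lr=0.5, seed=0):
    """Toy sketch of reinforcement learning with a verifiable reward.

    The 'model' is just a preference table over candidate answers;
    the reward is 1 when the sampled answer matches the known result.
    """
    rng = random.Random(seed)
    prefs = {c: 1.0 for c in candidates}  # uniform initial preferences
    for _ in range(steps):
        # Sample an answer in proportion to its current preference.
        total = sum(prefs.values())
        r, acc, sampled = rng.uniform(0, total), 0.0, candidates[-1]
        for c, w in prefs.items():
            acc += w
            if r <= acc:
                sampled = c
                break
        # Verifiable reward: 1 if the answer checks out, else 0.
        reward = 1.0 if sampled == correct_answer else 0.0
        prefs[sampled] += lr * reward  # reinforce rewarded answers only

    # After training, the most-preferred answer is the policy's choice.
    return max(prefs, key=prefs.get)
```

The point of the sketch is the feedback loop: no labeled reasoning traces are needed, only a checkable final answer, which is part of why this style of training is comparatively cheap.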
And they've just basically speed-run the entire industry and found a way to offer an effectively same-or-better model than a lot of the cutting edge we have today, at a cheaper cost.
Compare and contrast it to ChatGPT for me. So I have played with it. The real innovation here has been the DeepSeek R1 model. And by the way, no AI company knows how to name anything.