Nathan Lambert
It's research salaries, right? All these things are counted in the billions of dollars that OpenAI is spending, but they weren't counted in the $5 million to $6 million that DeepSeek spent. So there's a bit of misunderstanding of what these numbers are. And then there's also the element that NVIDIA has just been a straight line up, right?
And there have been so many different narratives trying to push down NVIDIA, or rather, NVIDIA's stock. Everyone is looking for a reason to sell or to be worried, right? It was Blackwell delays, their new GPU. Every two weeks there's a new report about their GPUs being delayed.
Then there's the whole thing about scaling laws ending. It's so ironic because it lasted a month. It was literally just, hey, models aren't getting better, they're just not getting better, there's no reason to spend more, pre-training scaling is dead. And then it's o1, o3, right? R1, right?
And now it's like, wait, models are progressing too fast. Slow down the progress, stop spending on GPUs, right? But the funniest thing that comes out of this, I think, is that Jevons paradox is true: AWS pricing for H100s has gone up over the last couple of weeks. Since a little after Christmas, since V3 was launched, AWS H100 pricing has gone up.
H200s are almost out of stock everywhere because the H200 has more memory, and therefore R1 wants that chip over the H100, right?
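As a rough sketch of why the memory matters, assuming R1's published ~671B parameters stored in FP8 and the chips' public HBM capacities (80 GB for the H100, 141 GB for the H200; the 30% KV-cache headroom is an illustrative guess, not a figure from the conversation), you can estimate how many GPUs a single serving replica needs:

```python
# Back-of-the-envelope: why R1 inference favors the H200's larger HBM.
# Assumptions (not from the transcript): DeepSeek-R1 has ~671B parameters
# in FP8 (~1 byte each), plus ~30% headroom for KV cache and activations.

WEIGHTS_GB = 671          # ~671B params * 1 byte (FP8) ≈ 671 GB
KV_HEADROOM = 1.3         # assumed extra for KV cache, activations, buffers

HBM_GB = {"H100": 80, "H200": 141}

for chip, mem in HBM_GB.items():
    total_gb = WEIGHTS_GB * KV_HEADROOM
    gpus = int(-(-total_gb // mem))   # ceiling division
    print(f"{chip}: need ~{gpus} GPUs just to hold the model")

# H100: ~11 GPUs; H200: ~7. Fewer GPUs per replica means less cross-GPU
# traffic and more leftover HBM for long reasoning-token KV caches.
```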
Right. And with semiconductors, we're at 50 years of Moore's law: every two years, half the cost, double the transistors, just like clockwork. It's slowed down, obviously, but the semiconductor industry has gone up the whole time. It's been wavy, there are obviously cycles, and I don't expect AI to be any different. There are going to be ebbs and flows.
But in AI, it's just playing out at an insane timescale. Moore's law was 2x every two years; this is 1200x in like three years. The scale of improvement is hard to wrap your head around.
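To make that contrast concrete, here's the arithmetic behind it, taking the quoted ~1200x-in-three-years figure at face value:

```python
import math

# Moore's law: 2x transistors (per dollar) every 2 years.
MOORES_DOUBLING_YEARS = 2.0

# The AI figure quoted above: ~1200x improvement in ~3 years.
ai_factor, ai_years = 1200, 3

# Implied doubling time, from factor = 2 ** (years / doubling_time):
ai_doubling_years = ai_years / math.log2(ai_factor)
print(f"AI doubling time: ~{ai_doubling_years:.2f} years "
      f"(~{ai_doubling_years * 12:.1f} months)")

# For contrast, Moore's law over the same 3 years:
print(f"Moore's law over 3 years: ~{2 ** (3 / MOORES_DOUBLING_YEARS):.1f}x")

# ~2.8x vs ~1200x: a doubling roughly every 3.5 months instead of every 24.
```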