Yann LeCun
Podcast Appearances
Well, I used to be a hardware guy many years ago. Decades ago.
Changed a little bit.
I mean, certainly scale is necessary, but not sufficient. Absolutely. So we certainly need computation. I mean, we're still far, in terms of compute power, from what we would need to match the compute power of the human brain. This may occur in the next couple of decades, but we're still some ways away. And certainly in terms of power efficiency, we're really far.
So a lot of progress to make in hardware. And right now, a lot of the progress is not, I mean, there's a bit coming from silicon technology, but a lot of it coming from architectural innovation. And quite a bit coming from more efficient ways of implementing the architectures that have become popular, basically a combination of transformers and convnets, right?
So there's still some ways to go until we saturate. Then we're going to have to come up with new principles, new fabrication technology, new basic components, perhaps based on different principles than classical digital CMOS.
Well, if you want to make it ubiquitous, yeah, certainly. Because we're going to have to reduce the power consumption. A GPU today is half a kilowatt to a kilowatt. The human brain is about 25 watts. And a single GPU is still way below the compute power of the human brain; you'd need something like 100,000 to a million of them to match it. So we are off by a huge factor here.
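The power gap described here can be sketched with back-of-the-envelope arithmetic. The wattage and GPU-count figures below are the rough ranges quoted in the conversation, not precise measurements:

```python
# Back-of-the-envelope power comparison using the figures quoted above.
# All numbers are rough, illustrative values, not measurements.

BRAIN_WATTS = 25                     # commonly cited estimate for the human brain
GPU_WATTS = (500, 1000)              # "half a kilowatt to a kilowatt" per GPU
GPUS_NEEDED = (100_000, 1_000_000)   # quoted range of GPUs to match brain compute

# Total power draw of a GPU cluster matching brain-level compute:
low = GPUS_NEEDED[0] * GPU_WATTS[0]    # 100,000 GPUs x 500 W  = 50 MW
high = GPUS_NEEDED[1] * GPU_WATTS[1]   # 1,000,000 GPUs x 1 kW = 1 GW

# Efficiency gap versus the brain's ~25 W:
print(f"cluster power: {low / 1e6:.0f}-{high / 1e6:.0f} MW")
print(f"efficiency gap: {low / BRAIN_WATTS:.0e} to {high / BRAIN_WATTS:.0e} x")
```

Even at the optimistic end of the quoted range, the gap in power efficiency works out to millions of times, which is the "huge factor" referred to above.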
So first of all, it's not going to be an event. The idea, popularized by science fiction and Hollywood, that somehow somebody is going to discover the secret to AGI or human-level AI or AMI, whatever you want to call it, and then turn on a machine and then we have AGI. That's just not going to happen. It's not going to be an event. It's going to be gradual progress.
Are we going to have systems that can learn from video how the world works and learn good representations? Yeah. Before we get them to the scale and performance that we observe in humans, it's going to take quite a while. It's not going to happen in one day. Are we going to get systems that can have a large amount of associative memory so they can remember stuff?