Dwarkesh Patel
Even animals can take on everything at once, right?
Animals are maybe a better example because they don't even have the scaffold of language.
They just get thrown out into the world and they just have to make sense of everything without any labels.
And the vision for AGI then should just be something which just looks at sensory data, looks at the computer screen, and it just figures out what's going on from scratch.
I mean, if a human were put in a similar situation, they would be trained from scratch.
But I mean, this is like a human growing up or an animal growing up.
So why shouldn't that be the vision for AI rather than this thing where we're doing millions of years of training?
I think that's a really good question.
Can you repeat the last sentence?
A lot of that intelligence is not motor tasks, that's what, sorry?
I'm going to take a second to digest that because there's a lot of different ideas.
Maybe one clarifying question I can ask to understand the perspective.
So I think you suggest that, look, evolution is doing the kind of thing that pre-training does in the sense of building something which can then understand the world.
The difference, I guess, is that evolution has to be titrated, in the case of humans, through three gigabytes of DNA.
And so that's very unlike the weights of a model.
I mean, literally the weights of the model are a brain, which obviously is not encoded in the sperm and the egg, or does not exist in the sperm and the egg.
So it has to be grown.
And also the information for every single synapse in the brain simply cannot exist in the three gigabytes that exist in the DNA.
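The capacity argument here can be sanity-checked with rough arithmetic. The ~3 gigabyte genome figure is from the conversation; the synapse count below is a common ballpark estimate, not something the speaker states:

```python
# Back-of-envelope check: can ~3 GB of DNA specify every synapse?
# Synapse count is an assumed ballpark (~10^14), not from the transcript.

genome_bits = 3e9 * 8            # ~3 gigabytes of DNA, expressed in bits
synapses = 1e14                  # rough estimate of synapses in a human brain
bits_per_synapse = genome_bits / synapses

print(f"genome capacity: {genome_bits:.1e} bits")
print(f"bits per synapse: {bits_per_synapse:.2e}")
# Far less than one bit per synapse, so the genome cannot list each
# connection; at best it encodes a growth and learning procedure.
```

Under these assumptions the genome supplies on the order of 10^-4 bits per synapse, which is the quantitative sense in which "the information for every single synapse simply cannot exist" in the DNA.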
Evolution seems closer to finding the algorithm