Andrej Karpathy
And they will say stuff that will shock you, because you can kind of see where they're coming from, but it's just not the thing people say.
Yeah.
And because they're not yet collapsed.
But we're collapsed. We end up revisiting the same thoughts, we end up saying more and more of the same stuff, the learning rates go down, the collapse continues to get worse, and then everything deteriorates.
It's an interesting idea.
I mean, I do think that...
When you're generating things in your head and then you're attending to it, you're kind of like training on your own samples.
You're training on your synthetic data.
And if you do it for too long, you go off the rails and you collapse way too much.
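(Not from the conversation, but a minimal toy sketch of the mechanism being described: a "model" that repeatedly retrains on its own samples. A Gaussian fit stands in for the model, and the sample size and generation count are arbitrary illustrative choices. The fitted width tends to shrink across generations, which is the collapse.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Start with samples from the "true" distribution, N(0, 1).
data = rng.normal(0.0, 1.0, size=50)

for generation in range(100):
    # Fit the model to the current data (max-likelihood Gaussian).
    mu_hat, sigma_hat = data.mean(), data.std()
    if generation % 10 == 0:
        print(f"gen {generation:3d}: sigma_hat = {sigma_hat:.3f}")
    # Next generation trains only on the model's own samples:
    # sampling noise compounds and the learned width tends toward zero.
    data = rng.normal(mu_hat, sigma_hat, size=50)
```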
So you always have to seek entropy in your life.
Talking to other people is a great source of entropy, and things like that.
So maybe the brain has also built some internal mechanisms for increasing the amount of entropy in that process.
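(Again a hypothetical sketch, not anything from the conversation: the same self-training loop, but with a fraction of each generation's data drawn from an outside source, the "entropy" in the analogy. The 20% mixing rate is an arbitrary assumption; the point is that the fresh data anchors the fit and keeps the width from collapsing toward zero.)

```python
import numpy as np

rng = np.random.default_rng(0)

n, fresh_fraction = 50, 0.2          # 20% of each generation is outside data
data = rng.normal(0.0, 1.0, size=n)

for generation in range(100):
    mu_hat, sigma_hat = data.mean(), data.std()
    if generation % 10 == 0:
        print(f"gen {generation:3d}: sigma_hat = {sigma_hat:.3f}")
    # Mix the model's own samples with fresh external data ("talking to
    # other people"); the external entropy prevents the collapse.
    n_fresh = int(fresh_fraction * n)
    own = rng.normal(mu_hat, sigma_hat, size=n - n_fresh)
    fresh = rng.normal(0.0, 1.0, size=n_fresh)
    data = np.concatenate([own, fresh])
```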
But yeah, maybe that's an interesting idea.
I think there's something very interesting about that.
Yeah, 100%.
I do think that humans, compared to LLMs, are a lot better at seeing the forest for the trees.
And we're not actually that good at memorization, which is a feature.
Because we're not that good at memorization, we're kind of forced to find the patterns in a more general sense.
I think LLMs, in comparison, are extremely good at memorization.