
Andrej Karpathy

👤 Speaker
3433 total appearances

Podcast Appearances

Dwarkesh Podcast
Andrej Karpathy — AGI is still a decade away

When you're generating things in your head and then you're attending to it, you're kind of like training on your own samples.

You're training on your synthetic data.

And if you do it for too long, you go off rails and you collapse way too much.
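
The collapse Karpathy alludes to can be sketched with a toy simulation (my illustration, not anything from the episode): repeatedly fit a Gaussian to samples drawn from the previous generation's fit, so each generation trains only on its own synthetic data. With finite samples, the fitted variance — a crude proxy for entropy — drifts toward zero.

```python
import numpy as np

# Toy model-collapse sketch (illustrative assumption, not Karpathy's setup):
# generation 0 is a standard Gaussian; every later generation is fit purely
# to samples drawn from the previous generation's fitted model.
rng = np.random.default_rng(0)

mu, sigma = 0.0, 1.0          # the "real" distribution at generation 0
n_samples = 50                # finite sample size is what drives the collapse
initial_var = sigma ** 2

for generation in range(1000):
    samples = rng.normal(mu, sigma, n_samples)  # sample from current model
    mu = float(samples.mean())                  # refit on own synthetic data
    sigma = float(samples.std())                # MLE shrinks variance in expectation

final_var = sigma ** 2
print(f"variance: {initial_var:.3f} -> {final_var:.3g}")
```

Without an outside source of entropy (fresh real data — or, in the quote's terms, talking to other people), the distribution narrows generation after generation.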

So you always have to like seek entropy in your life.

So talking to other people is a great source of entropy and things like that.

So maybe the brain has also built some internal mechanisms for increasing the amount of entropy in that process.

But yeah, maybe that's an interesting idea.

I think there's something very interesting about that.

Yeah, 100%.

I do think that humans, compared to LLMs, have a lot more of an element of seeing the forest for the trees.

And we're not actually that good at memorization, which is actually a feature.

Because we're not that good at memorization, we're kind of forced to find the patterns in a marginal sense.

I think LLMs, in comparison, are extremely good at memorization.

They will recite passages from all these training sources.

You can give them completely nonsensical data, like you can hash some amount of text or something like that.

You get a completely random sequence.

If you train on it, even just, I think, a single iteration or two, it can suddenly regurgitate the entire thing.

It will memorize it.

There's no way a person can read a sequence of random numbers once and recite it back to you.
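
As a toy illustration of memorization-without-understanding (my sketch, not an experiment from the episode): a k-gram lookup "model" that sees a completely random token sequence exactly once can regurgitate it verbatim. An LLM is of course vastly more than a lookup table, but the point that nonsense can be stored after a single pass carries over.

```python
import random

# Illustrative assumption: a tiny k-gram lookup table standing in for a
# model with enough capacity to memorize. VOCAB, LENGTH, and K are
# arbitrary choices for the demo.
random.seed(0)
VOCAB, LENGTH, K = 1000, 200, 4

# "Completely nonsensical data": a uniformly random token sequence.
sequence = [random.randrange(VOCAB) for _ in range(LENGTH)]

# "Train" in a single pass: map each k-token context to the next token.
model = {}
for i in range(LENGTH - K):
    context = tuple(sequence[i:i + K])
    model[context] = sequence[i + K]

# Generate: prime with the first k tokens, then follow the memorized map.
output = sequence[:K]
while len(output) < LENGTH:
    output.append(model[tuple(output[-K:])])

# Check that the model regurgitates the random sequence.
print(output == sequence)
```

With a large vocabulary, random k-gram contexts are almost surely unique, so one pass suffices for perfect recall — capacity, not comprehension, is all memorization demands.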