Jeremiah
Instead, evolution gives us algorithms that let us learn from experience.
These algorithms are a second optimization loop, evolving "neuron patterns" into forms that better promote fitness, reproduction, etc.
The most powerful such algorithm is called predictive coding, which neuroscience increasingly considers a key organizing principle of the brain.
Wikipedia describes it as, quote,
End quote.
Scott writes, in other words, the brain organizes itself and learns things by constantly trying to predict the next sense datum, then updating synaptic weights towards whatever form would have predicted the next sense datum most efficiently.
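The update rule described here, predict the next sense datum, measure the surprise, then nudge the synaptic weights toward whatever would have predicted it better, can be sketched as a toy loop. This is an illustrative sketch only, not the post's own code: a single "synapse" learning to predict a simple stream of sense data, with all numbers chosen arbitrarily.

```python
import random

random.seed(0)

# A toy stream of "sense data": each datum depends on the previous one,
# generated by a hidden rule (true_w = 0.8) plus noise.
true_w = 0.8
stream = [1.0]
for _ in range(2000):
    stream.append(true_w * stream[-1] + random.gauss(0, 0.5))

# The "brain": a single synaptic weight, nudged toward whatever value
# would have predicted each incoming sense datum better.
w = 0.0
lr = 0.05
for prev, nxt in zip(stream, stream[1:]):
    prediction = w * prev       # predict the next sense datum
    error = nxt - prediction    # surprise: what arrived vs. what was predicted
    w += lr * error * prev      # update the "synapse" to shrink future surprise

print(round(w, 2))  # settles near the hidden generating weight, 0.8
```

The learned weight converges toward the rule that actually generates the stream, which is the sense in which prediction-error updates build a model of the world.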
This is a very close, though not exact, analog of the next-token prediction of AI.
This process organizes the brain into a form capable of predicting sense data called a world model.
For example, if you encounter a tiger, the best way of predicting the resulting sense data, the appearance of the tiger pouncing, the sound of the tiger's roar, the burst of pain at the tiger's jaws closing around your arm, is to know things about tigers.
On the highest and most abstract levels, these are things like "tigers are orange," "tigers often pounce," and "tigers like to bite people."
On lower levels, they involve the ability to translate high-level facts like "tigers often pounce" into a probabilistic prediction of the tiger's exact trajectory.
All of this is done via neural circuits we don't entirely understand and implemented through the usual neuroscience stuff like synapses and neurotransmitters.
To you, it just feels like, "I don't know, I thought about it and I realized the tiger would pounce over there."
Here's a picture of a tiger, ready to pounce.
3.
The AI's equivalent of evolution is the AI companies designing them.
Just like evolution, the AI companies realized that it was inefficient to hand-code everything the AIs needed to know (a giant look-up table) and instead gave the AIs learning algorithms: deep learning.
As with humans, the most powerful of these learning algorithms was next-token prediction.
This algorithm feeds the AI a stream of tokens, then updates the AI's innards into a form that would have predicted the next token efficiently.
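That loop, feed in a token, score the possible next tokens, then update the innards toward whatever would have predicted the actual next token, can be sketched in miniature. Everything here is an illustrative assumption, not the post's own material: a four-word vocabulary, a repetitive toy corpus, and a single logit table standing in for the AI's innards.

```python
import math

# Tiny vocabulary and a repetitive toy "corpus" of tokens.
vocab = ["the", "cat", "sat", "."]
corpus = ("the cat sat . " * 200).split()
tok = {w: i for i, w in enumerate(vocab)}
ids = [tok[w] for w in corpus]

# The model's "innards": a logit table mapping current token -> next-token scores.
V = len(vocab)
W = [[0.0] * V for _ in range(V)]
lr = 0.1

for cur, nxt in zip(ids, ids[1:]):
    # Softmax over the current token's logits gives next-token probabilities.
    logits = W[cur]
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Cross-entropy gradient step: shift probability mass toward the
    # token that actually came next in the stream.
    for j in range(V):
        W[cur][j] -= lr * (probs[j] - (1.0 if j == nxt else 0.0))

# After training, the innards encode the corpus's statistics.
best = max(range(V), key=lambda j: W[tok["cat"]][j])
print(vocab[best])  # → "sat"
```

Nothing in the table "looks like" the sentence it predicts; it is just numbers whose shape was carved by the prediction-error updates, which is the point the next line makes.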
But this doesn't mean the AI's innards look like, "Hmm, what will the next token be?"