Michael Levin
And so we can measure that now with metrics of causal emergence like phi and things like that.
So we know that in order to learn, you have to have significant phi.
But I wanted to ask the opposite question.
What does learning do for your phi level?
Does it do anything for your degree of being an agent that is more than the sum of its parts?
So we train the networks.
And sure enough, some of them, not all of them, but some of them, as you train them, their phi goes up.
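Computing true IIT phi is involved (the PyPhi library exists for that), but the flavor of "more than the sum of its parts" can be sketched with a much simpler proxy: total correlation, which is the sum of the parts' entropies minus the whole system's entropy. This is a toy illustration of the kind of integration measure being discussed, not the phi metric used in the work described; the distributions below are invented for the example.

```python
import math

def entropy(probs):
    # Shannon entropy in bits of a probability distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def total_correlation(joint):
    # joint: dict mapping (a, b) binary states to probabilities.
    # Total correlation = H(A) + H(B) - H(A, B): zero when the parts
    # are independent, positive when the whole carries structure
    # beyond its parts.
    h_joint = entropy(joint.values())
    p_a = [sum(p for (a, _), p in joint.items() if a == v) for v in (0, 1)]
    p_b = [sum(p for (_, b), p in joint.items() if b == v) for v in (0, 1)]
    return entropy(p_a) + entropy(p_b) - h_joint

# Two hypothetical two-node networks: one with independent nodes,
# one whose nodes are perfectly coupled.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
coupled     = {(0, 0): 0.5, (1, 1): 0.5}

print(total_correlation(independent))  # 0.0 bits: no integration
print(total_correlation(coupled))      # 1.0 bits: whole exceeds its parts
```

A network whose phi-like integration rises during training would show this measure climbing as its units become more correlated through learning.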
Okay.
And so basically what we were able to find is that there is this positive feedback loop: every time you learn something, you become more of an integrated agent.
And every time you do that, it becomes easier to learn.
And so it's this virtuous cycle.
It's an asymmetry that points upwards for agency and intelligence.
And now back to our platonic space stuff.
Where does that come from?
It doesn't come from evolution.
You don't need to have any evolution for this.
Evolution will optimize the crap out of it for sure, but you don't need evolution to have this.
It doesn't come from physics.
It comes from the rules of information, from causal information theory, and from the behavior of networks as mathematical objects.