Judea Pearl
We have been deflected by the effect of LLMs.
You have low-hanging fruit and everybody is excited, which is fine.
I mean, they're doing a tremendously impressive job.
But I don't think they take us toward AGI.
No, no, no, no, no, no.
More data and scaling up: I don't think that's going to get us over the hump that we need to cross.
There are certain limitations, mathematical limitations, that are not crossable by scaling up.
I show it clearly mathematically in my book.
And what LLMs do right now is summarize world models authored by people like you and me, available on the web. They do some sort of mysterious summary of those models, rather than discovering the world models directly from the data.
To give you an example: if you have data coming from hospitals about the effect of treatments, you don't feed it directly into the LLMs today. The input is an interpretation of that data, authored by doctors, physicians, people who already have a world model of the body, of disease and what it does.
Here you have a limitation.
You have the limitation defined by the ladder of causation.
There is something that you cannot do if you don't have a certain input.
For instance, you cannot get causation from correlation.
That is well established, okay?
No one would dispute that.
And you cannot get counterfactuals from intervention.
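The first of those impossibility claims, that correlation alone cannot determine causation, can be sketched with a small simulation (a hypothetical illustration, not from the talk). Two structural models, one where X causes Y and one where a hidden confounder Z drives both, produce the same observational correlation between X and Y, yet behave differently under the intervention do(X=2):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Model A: X -> Y (X causes Y)
x_a = rng.normal(size=n)
y_a = x_a + rng.normal(size=n)

# Model B: a hidden confounder Z drives both (X <- Z -> Y);
# X itself has no effect on Y
z = rng.normal(size=n)
x_b = z.copy()
y_b = z + rng.normal(size=n)

# Observationally the models are indistinguishable:
# both give corr(X, Y) of about 1/sqrt(2) ~ 0.707
print(np.corrcoef(x_a, y_a)[0, 1])
print(np.corrcoef(x_b, y_b)[0, 1])

# An intervention do(X=2) tells them apart:
y_a_do = 2.0 + rng.normal(size=n)  # Model A: Y responds to the forced X
y_b_do = z + rng.normal(size=n)    # Model B: Y ignores the forced X
print(y_a_do.mean())  # close to 2.0
print(y_b_do.mean())  # close to 0.0
```

No amount of passively observed (X, Y) data separates the two models; only acting on X does, which is the step from the first rung of the ladder of causation to the second.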