Eliezer Yudkowsky
It's 50 years off.
Or they may be like, it's only a very tiny amount.
And, you know, the thing I would worry about is that if this is how things are scaling, then, jumping out ahead and trying not to be wrong in the same way that I've been wrong before, maybe GPT-5 is more unambiguously a general intelligence.
And maybe that is getting to a point where it is like even harder to turn back.
Not that it would be easy to turn back now, but, you know, if you start integrating GPT-5 into the economy, it is even harder to turn back past there.
I was expecting more of that.
The fact that GPT-4 is kind of on the threshold, neither here nor there — that itself is not quite how I expected it to play out.
I was expecting there to be more of a sense of distinct discoveries, like the discovery of Transformers, where you would stack them up, and there would be a final discovery, and then you would get something that was more clearly a general intelligence.
So the way that you take what is probably basically the same architecture as GPT-3, throw maybe 20 times as much compute at it, and get out GPT-4 — and then it's maybe just barely a general intelligence, or a narrow general intelligence, or, you know, something we don't really have the words for.
Yeah, that's not quite how I expected it to play out.
It's definitely a big leap from GPT-3.
I mean, we do actually understand why the ReLUs make a big difference compared to Sigmoids.
But yes, they're probably using G4789 ReLUs, or whatever the acronyms are up to now, rather than ReLUs.
Yeah, that's part of the modern paradigm of alchemy.
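The point above about ReLUs versus sigmoids refers to the vanishing-gradient problem: a sigmoid's derivative shrinks toward zero for large inputs, while a ReLU's derivative is a constant 1 on its active side, so gradient signal survives through deep stacks. A minimal pure-Python sketch of that comparison (no particular framework assumed):

```python
import math

def sigmoid(x):
    """Logistic sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    """Derivative of sigmoid: s * (1 - s). Peaks at 0.25 and
    decays toward 0 as |x| grows (the vanishing gradient)."""
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    """Derivative of ReLU: exactly 1 wherever the unit is active,
    so repeated multiplication through layers does not shrink it."""
    return 1.0 if x > 0 else 0.0

# For a large pre-activation, the sigmoid gradient is tiny while
# the ReLU gradient stays at 1.
print(sigmoid_grad(10.0))  # ~4.5e-05
print(relu_grad(10.0))     # 1.0
```

Multiplying many sigmoid derivatives (each at most 0.25) through a deep network drives the gradient toward zero exponentially; ReLU-family activations avoid that, which is the "big difference" being referred to.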