Eliezer Yudkowsky
No, I'm saying that if you go about asking all day long, like...
Do I have enough ego?
Do I have too much of an ego?
I think you get worse at making good predictions.
I think that to make good predictions, you're like, how did I think about this?
Did that work?
Should I do that again?
So like Robin Hanson and I debated AI systems, and I think that the person who won that debate was Gwern, and I think that reality was, like,
to the Yudkowskian side of the Yudkowsky-Hanson spectrum, but even further out than Yudkowsky.
And I think that's because I was trying to sound reasonable compared to Hanson, saying things that were defensible relative to Hanson's arguments, and reality was way over here.
In particular, with respect to... Hanson was like, all the systems will be specialized.
Hanson may disagree with this characterization.
I was like, I think we build specialized underlying systems that, when you combine them, are good at a wide range of things.
And the reality is like, no, you just stack more layers and do a bunch of gradient descent.
And I feel, looking back, that by trying to have this reasonable position contrasted with Hanson's position...
I missed the ways that reality could be more extreme than my position in the same direction.
So is this a failure to have enough ego?
Is this a failure to make myself be independent?
I would say that this is something like a failure to consider positions that would sound even wackier and more extreme when people are already calling you extreme.