Sam Altman
GPT-4 has weirdly not been that much of an update for most people.
You know, they're like, oh, it's better than 3.5, but I thought it was going to be better than 3.5.
And it's cool.
But you know, this is like, someone said to me over the weekend,
you shipped an AGI and I somehow like, I'm just going about my daily life and I'm not that impressed.
And I obviously don't think we shipped an AGI, but I get the point and the world is continuing on.
I think there's like a bunch of interesting lessons from COVID and the UFO videos and a whole bunch of other stuff that we can talk about there.
But on the takeoff question, if we imagine a two by two matrix of short timelines till AGI starts, long timelines till AGI starts, slow takeoff, fast takeoff, do you have an instinct on what do you think the safest quadrant would be?
So the takeoff, we start the takeoff period.
Yep.
Next year or in 20 years.
20 years.
And then it takes one year or 10 years.
Well, you can even say one year or five years, whatever you want for the takeoff.
So do I. Longer now.
I'm in the slow-takeoff, short-timelines quadrant.
It's the most likely good world, and we optimize the company to have maximum impact in that world, to try to push for that kind of a world.
And the decisions that we make are... You know, there are probability masses, but weighted towards that.
And I think...
I'm very afraid of the fast takeoffs.