Sam Altman
It was nice of them. And then, like, you know, I kind of did like a few hours of just this absolute fugue state in the hotel room, trying to... I was just confused beyond belief, trying to figure out what to do. It was so weird. And then flew home at maybe, like, I don't know, 3 p.m. or something like that. Still just, you know, crazy nonstop phone blowing up.
Met up with some people in person. By that evening, I was like, okay, you know, I'll just like go do AGI research and was feeling pretty happy about the future. Yeah, you have options. And then the next morning, had this call with a couple of board members about coming back and that led to a few more days of craziness. And then, uh, and then it kind of, I think it got resolved.
Well, it was like a lot of insanity in between.
Um, well, we only have a nonprofit board, so it was all the nonprofit board members. The board had gotten down to six people. And then they removed Greg from the board and then fired me. So, but it was like, you know.
I think there's always been culture clashes at... Look, obviously... not all of those board members are my favorite people in the world, but I have serious respect for the gravity with which they treat AGI and the importance of getting AI safety right. And even if I stringently disagree with their decision making and actions, which I do, I have never once doubted their
integrity or commitment to the sort of shared mission of safe and beneficial AGI. You know, do I think they, like, made good decisions in the process of that or kind of know how to balance all the things OpenAI has to get right? No. But I think the, like... The intent. The intent of the magnitude of AGI and getting that right
very afraid of AGI or very afraid of even current AI and very excited about it and even more afraid and even more excited about where it's going. And we We wrestle with that, but I think it is unavoidable that this is going to happen. I also think it's going to be tremendously beneficial. But we do have to navigate how to get there in a reasonable way.
And a lot of stuff is going to change, and change is pretty uncomfortable for people. So there's a lot of pieces that we got to get right.
Yeah, I wish I had taken equity so I never had to answer this question.
The decision back then, the original reason, was just the structure of our nonprofit. There was something about... yeah, okay, this is like nice from a motivations perspective, but mostly it was that our board needed to be a majority of disinterested directors. And I was like, that's fine, I don't need equity right now. I kind of...
One thing I have noticed: it's so deeply unimaginable to people to say, "I don't really need more money." Like, and I get how toned up.