Sam Altman
Podcast Appearances
One thing I have noticed: it's so deeply unimaginable to people to hear someone say, "I don't really need more money." Like, and I get how that comes across.
Well, yeah, yeah, yeah. No, so it assumes.
If I were just trying to say, like, "I'm going to try to make a trillion dollars with OpenAI," I think everybody would have an easier time with it. It would save a lot of conspiracy theories.
So the things like, you know, device companies, or if we were doing some chip fab company, those are not Sam projects. Those would be like, OpenAI would get that equity. They would.
Well, that's not, like, the perception of, kind of, the people like you who have to commentate on this stuff all day, which is fair, because we haven't announced this stuff, because it's not done. I don't think most people in the world are thinking about this, but I agree it spins up a lot of conspiracy theories in, like, tech commentators.
And if I could go back, yeah, I would just say, like, let me take equity and make that super clear. And then everyone would be like, all right. Like, I'd still be doing it, because I really care about AGI and think this is, like, the most interesting work in the world. But it would at least check out to everybody.
I don't know where that came from, actually. I genuinely don't. I think the world needs a lot more AI infrastructure, a lot more than it's currently planning to build and with a different cost structure. The exact way for us to play there is we're still trying to figure that out. Got it.
Oh, I have to go in a minute. It's not because... It's not to prevent the edge cases that we need to be more organized, but it is that these systems are so complicated and concentrating bets is so important. Like, one...
You know, at the time, before it was, like, obvious to do this, you had, like, DeepMind or whatever with all these different teams doing all these different things, spreading their bets out. And you had OpenAI say, we're going to, like, basically put the whole company to work together to make GPT-4. And that was, like, unimaginable for how to run an AI research lab. But it is...
I think what works, at a minimum, it's what works for us. So not because we're trying to prevent edge cases, but because we want to concentrate resources and do these big, hard, complicated things, we do have a lot of coordination on what we work on.
Great talking to you guys. Yeah, it was fun. Thanks for coming out.