
Sam Altman

Speaker
3367 total appearances

Podcast Appearances

Lex Fridman Podcast
#419 – Sam Altman: OpenAI, GPT-5, Sora, Board Saga, Elon Musk, Ilya, Power & AGI

And in that sense, I do think it'll be a moment where a lot of the world went from not believing to believing. That was more about the ChatGPT interface than the... And by the interface and product, I also mean the post-training of the model and how we tune it to be helpful to you and how to use it, than the underlying model itself.

Lex Fridman Podcast
#419 – Sam Altman: OpenAI, GPT-5, Sora, Board Saga, Elon Musk, Ilya, Power & AGI

I mean, they're both super important, but the RLHF, the post-training step, the little wrapper of things that, from a compute perspective, little wrapper of things that we do on top of the base model, even though it's a huge amount of work, that's really important to say nothing of the product that we build around it. In some sense, we did have to do two things.

Lex Fridman Podcast
#419 – Sam Altman: OpenAI, GPT-5, Sora, Board Saga, Elon Musk, Ilya, Power & AGI

We had to invent the underlying technology, and then we had to figure out how to make it into a product people would love, which is not just about the actual product work itself, but this whole other step of how you align and make it useful.

Lex Fridman Podcast
#419 – Sam Altman: OpenAI, GPT-5, Sora, Board Saga, Elon Musk, Ilya, Power & AGI

And that. But... You know, that was like a known difficult thing. Like we knew we were going to have to scale it up. We had to go do two things that had like never been done before that were both like I would say quite significant achievements. And then a lot of things like scaling it up that other companies have had to do before.

Lex Fridman Podcast
#419 – Sam Altman: OpenAI, GPT-5, Sora, Board Saga, Elon Musk, Ilya, Power & AGI

Most people don't need all the way to 128 most of the time, although if we dream into the distant future, like way distant future, we'll have context length of several billion. You will feed in all of your information, all of your history over time, and it'll just get to know you better and better, and that'll be great. So for now, the way people use these models, they're not doing that.
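The "128" here refers to a 128K-token context window. As a rough illustration (not from the transcript, and using the common but approximate heuristic of about 4 characters per token for English text), you can sanity-check whether a document would fit in such a window:

```python
# Rough feasibility check: does a document fit in a 128K-token context window?
# Assumes the common ~4 characters-per-token heuristic for English text;
# real tokenizers (e.g. BPE) vary, so treat the result as an estimate only.

CHARS_PER_TOKEN = 4            # rough heuristic, not an exact tokenizer count
CONTEXT_WINDOW_TOKENS = 128_000

def estimated_tokens(text: str) -> int:
    """Approximate token count from character length."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_context(text: str, window: int = CONTEXT_WINDOW_TOKENS) -> bool:
    """True if the estimated token count is within the window."""
    return estimated_tokens(text) <= window

paper = "word " * 8_000           # ~40,000 characters
print(estimated_tokens(paper))    # ~10,000 tokens, well under 128K
print(fits_in_context(paper))     # True
```

By this estimate a typical paper uses well under a tenth of a 128K window, which matches the point in the quote: most usage does not come close to filling the long context.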

Lex Fridman Podcast
#419 – Sam Altman: OpenAI, GPT-5, Sora, Board Saga, Elon Musk, Ilya, Power & AGI

And, you know, people sometimes post in a paper or, you know, a significant fraction of a code repository or whatever. But most usage of the models is not using the long context most of the time.

Lex Fridman Podcast
#419 – Sam Altman: OpenAI, GPT-5, Sora, Board Saga, Elon Musk, Ilya, Power & AGI

I saw this internet clip once. I'm going to get the numbers wrong, but it was like Bill Gates talking about the amount of memory on some early computer. Maybe 64K, maybe 640K, something like that. And most of it was used for the screen buffer.

Lex Fridman Podcast
#419 – Sam Altman: OpenAI, GPT-5, Sora, Board Saga, Elon Musk, Ilya, Power & AGI

And he just couldn't seem to imagine, genuinely couldn't imagine, that the world would eventually need gigabytes of memory in a computer, or terabytes of memory in a computer. And you always do just need to follow the exponential of technology; we will find out how to use better technology.