Victor Riparbelli
Podcast Appearances
Like most people, I think we will interact with models like we do in ChatGPT or whatever today, for sure. But there are a lot of things you won't notice as a consumer when you're just using someone else's platform or product. There'll be a lot of workflows in the background that are also driven by LLMs. You'll definitely see a lot of companies building their own specialized LLMs, right?
But I think we should think more about products than models.
It's going to be incredibly difficult to do anything that can mirror the ChatGPT moment when it first came out. As humans, I think we saturate incredibly easily. We often forget how crazy GPT-3 actually is when you use it, right?
So my sense is that when they release something, it'll probably be good, and there'll probably be a whole bunch of tech nerds who will be very excited about all the things you can now do. And maybe I'm wrong and it's just this super intelligent thing that will self-improve and save the world. But I think it'll come out and it'll be, like, good.
It'll definitely be better than the previous things. But I think most people will not care that much, because the bar for someone to really care that much is incredibly high, right? And there's all this talk about whether this is AGI or isn't AGI. And I think all those debates are kind of riddled with, well, what's the definition of AGI, right?
If you took ChatGPT 200 years back in time, right, in England, and you showed someone this magical thing you've built, you'd be burned at the stake. If you did it 50 years ago, people would definitely think, this is AGI. I'm talking to a computer that knows everything about the world. This is absolutely batshit, right? And then today, people are like, ah, it's just a stochastic parrot.
And I don't know what's true or what's not true there. I think it'll be very powerful. I don't think it'll have as much of the cultural moment that ChatGPT created initially. And as I said earlier, I think there'll be a small group of people who really care deeply about which benchmarks it works against.