
Eiso Kant

👤 Person
612 total appearances

Podcast Appearances

The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch
20VC: Raising $500M To Compete in the Race for AGI | Will Scaling Laws Continue: Is Access to Compute Everything | Will Nvidia Continue To Dominate | The Biggest Bottlenecks in the Race for AGI with Eiso Kant, CTO @ Poolside

Now, our point of view is that that world is still quite a bit out, and that we are actually going to end up in a place before that where we see human-level capabilities in areas that are massively economically valuable and can drive abundance in the world for all of us, capabilities that are not going to be equally distributed, not for every single thing.

And what I mean by that is that if you think about foundation models today, I have a kind of simple mental model about them, which is that we are taking large web-scale data, we're compressing it into a neural net, and we're forcing generalization and learning. And this has led to things like incredible language understanding in these models.

But it's also led to cases where we look at these models and say, they're kind of dumb. Why aren't they able to do X, Y, or Z? And our point of view is that the reason they're not able to do X, Y, or Z has to do with how they learn. The most important part, I think, of what I said is the scale of data. When we have web-scale data, we can get language understanding.

But in areas where we have very little data, models really struggle to learn truly more capable skills. And I mean improvements in reasoning, improvements in planning capabilities, improvements in deep understanding of things. While as humans we don't require so much data, the way to think about models is that they require orders of magnitude more data to learn the same thing.

Our focus is on software development and coding, and it's for a very specific reason. The world has already generated an incredibly large data set of code. To put it a little bit into context, usable code for training amounts to about 3 trillion tokens.

And if you look at usable English-language text on the internet for training, we're talking about anywhere between 10 and 15 trillion tokens. There's a massive amount of code that the world has developed. Over 400 million codebases are publicly available on the internet. So why don't we already have this incredible AI that's able to do everything in coding?

It's because coding is not just about the output of the work. The code that we have online represents the final product, but it doesn't represent all of the thinking and actions that we took to get there. And that's the missing data set. The missing data set in the world to go from where models are today to being as capable as humans at building software is the data set that represents being given the task, all of your intermediate reasoning and thinking, the steps that you take, the code that you write and try to run, and then it fails and you learn from those interactions, all the way to getting that final product. And that intermediate data set is what Poolside is focused on creating.