Michael Truell
And so traditionally, that's what a code editor has meant. And I think that what a code editor is is going to change a lot over the next 10 years as what it means to build software maybe starts to look a bit different. I think also a code editor should just be fun.
Like fundamentally, I think one of the things that draws a lot of people to building stuff on computers is this insane iteration speed, where in other disciplines you might be gated by resources, or even by the ability to get a large group together, and coding is this amazing thing where it's you and the computer, and with that alone you can build really cool stuff really quickly.
Okay. So what's the origin story of Cursor? So around 2020, the scaling laws papers came out from OpenAI. And that was a moment where this looked like clear, predictable progress for the field, where even if we didn't have any more ideas, it looked like you could make these models a lot better if you had more compute and more data.
So around that time, for some of us, there were a lot of conceptual conversations about what's this going to look like? What's the story going to be for all these different knowledge worker fields about how they're going to be made better by this technology getting better?
And then I think there were a couple of moments where the theoretical gains predicted in those papers started to feel really concrete. And it started to feel like a moment where you could actually go and do useful work in AI without doing a PhD. It actually felt like now there was this whole set of systems one could build that were really useful.
And I think the first moment we already talked about a little bit, which was playing with the early beta of Copilot; that was awesome and magical. I think the next big moment, where everything kind of clicked together, was actually getting early access to GPT-4. It was around the end of 2022 when we were tinkering with that model. And the step up in capabilities felt enormous.
And previous to that, we had been working on a couple of different projects. Because of Copilot, because of scaling laws, because of our prior interest in the technology, we had been tinkering around with tools for programmers, but things that were very specific.