
Michael Truell

👤 Speaker
255 total appearances

Appearances Over Time

Podcast Appearances

Lex Fridman Podcast
#447 – Cursor Team: Future of Programming with AI

And you zap that with tons and tons and tons of compute, and you're willing to put in $50 to solve that bug or something even more.

Lex Fridman Podcast
#447 – Cursor Team: Future of Programming with AI

There is also a potential world where there's a technical solution to this like honor system problem too, where if we can get to a place where we understand the output of the system more, I mean, to the stuff we were talking about with like, you know, error checking with the LSP and then also running the code.

Lex Fridman Podcast
#447 – Cursor Team: Future of Programming with AI

But if you could get to a place where you could actually somehow verify, oh, I have fixed the bug, maybe then the bounty system doesn't need to rely on the honor system too.

Lex Fridman Podcast
#447 – Cursor Team: Future of Programming with AI

Yeah, I think that it has been an interesting journey adding each extra zero to the requests per second. You run into all of these issues where the general components you're using for caching and databases run into issues as you make things bigger and bigger. And now we're at the scale where we get int overflows on our tables and things like that.
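The int overflow mentioned here is concrete: a signed 32-bit INT column, a common default for auto-increment ids, tops out at 2,147,483,647. A minimal illustrative sketch (not Cursor's actual schema; the function name is hypothetical):

```python
# Illustration of a 32-bit signed integer id column running out of room.
INT32_MAX = 2**31 - 1  # 2,147,483,647 -- the ceiling of a SQL INT

def next_id(current_id: int) -> int:
    """Simulate allocating the next auto-increment id on an INT column."""
    if current_id >= INT32_MAX:
        # The usual fix is migrating the column to BIGINT (max 2**63 - 1).
        raise OverflowError("id column exhausted; migrate to BIGINT")
    return current_id + 1

print(next_id(100))   # 101
print(2**63 - 1)      # BIGINT headroom: 9223372036854775807
```

At high request rates the gap between "plenty of ids left" and "exhausted" closes surprisingly fast, which is why these overflows tend to surface only at scale.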

Lex Fridman Podcast
#447 – Cursor Team: Future of Programming with AI

And then also, there have been some custom systems that we've built, like, for instance, our retrieval system for computing a semantic index of your codebase and answering questions about a codebase that have continually, I feel like, been one of the trickier things to scale.
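The semantic index described here can be sketched in miniature: embed each code chunk once, then answer a question by ranking chunks against the query. This is a toy sketch, not Cursor's system; the bag-of-words `embed()` stands in for a real embedding model.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real system would call a model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(index: dict, query: str, k: int = 1) -> list:
    q = embed(query)
    ranked = sorted(index, key=lambda path: cosine(index[path], q), reverse=True)
    return ranked[:k]

# Build the index once, query many times.
chunks = {
    "auth.py": "def login(user, password): verify credentials",
    "db.py": "def connect(): open database connection pool",
}
index = {path: embed(text) for path, text in chunks.items()}
print(search(index, "how do we verify a password"))  # ['auth.py']
```

The scaling difficulty in the quote comes from doing this across millions of codebases: the index must be built incrementally, kept in sync with edits, and queried with low latency.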

Lex Fridman Podcast
#447 – Cursor Team: Future of Programming with AI

It's tricky. I think we can do a lot better at computing the context automatically in the future. One thing that's important to note is there are trade-offs with including automatic context.

Lex Fridman Podcast
#447 – Cursor Team: Future of Programming with AI

So the more context you include for these models, first of all, the slower they are and the more expensive those requests are, which means you can then do less model calls and do less fancy stuff in the background. Also, for a lot of these models, they get confused if you have a lot of information in the prompt.
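The trade-off described here is essentially a budgeting problem: given scored context snippets and a token budget, include only the most relevant ones rather than everything. A hypothetical greedy sketch (all names and numbers are illustrative, not from Cursor):

```python
def select_context(snippets, budget_tokens):
    """snippets: list of (relevance, token_count, text).
    Greedily pack the highest-relevance snippets into the token budget."""
    chosen, used = [], 0
    for rel, tokens, text in sorted(snippets, reverse=True):
        if used + tokens <= budget_tokens:
            chosen.append(text)
            used += tokens
    return chosen

snips = [(0.9, 400, "definition of the function being edited"),
         (0.5, 800, "loosely related utility file"),
         (0.8, 300, "call sites of the function")]
print(select_context(snips, budget_tokens=800))
# ['definition of the function being edited', 'call sites of the function']
```

A tighter budget keeps requests fast and cheap and avoids diluting the prompt with low-relevance material, which is the "bar for accuracy and relevance" mentioned in the next quote.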

Lex Fridman Podcast
#447 – Cursor Team: Future of Programming with AI

So the bar for accuracy and for relevance of the context you include should be quite high. But already we do some automatic context in some places within the product. It's definitely something we want to get a lot better at. And I think that there are a lot of cool ideas to try there, both on learning better retrieval systems, like better embedding models and better re-rankers.