Corey Knowles
But today, it's 2025, and I think what's really accelerated over the last year or two
is that now, even without taking that comp sci class, you can probably get the LLM to write you a bit of SQL.
And you can press run and you're like, oh, the results don't look quite right.
Let me actually add a column to this.
You tell the LLM that and you press run and it looks right.
And so that, I think, is a pretty big development over the last year or two.
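To make that loop concrete, here is a minimal Python sketch of the ask, run, inspect, refine cycle described above. The LLM call is stubbed out as a hypothetical ask_llm helper that returns canned SQL, and the table and column names are assumptions purely for illustration; nothing here reflects a specific model or product.

```python
import sqlite3

# Stand-in for an LLM call; it returns canned SQL so the sketch runs end to end.
# In practice the prompt would go to whatever model you actually use.
CANNED = {
    "v1": "SELECT customer_id, SUM(amount) AS total FROM orders GROUP BY customer_id",
    "v2": ("SELECT customer_id, SUM(amount) AS total, AVG(amount) AS avg_amount "
           "FROM orders GROUP BY customer_id"),
}

def ask_llm(prompt: str, version: str) -> str:
    # Hypothetical helper: pretend the model turned the prompt into SQL.
    return CANNED[version]

# Assumed toy database so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10.0), ("a", 20.0), ("b", 5.0)])

# First pass: ask for a query, press run, eyeball the results.
sql_v1 = ask_llm("Total order amount per customer from the orders table.", "v1")
print(conn.execute(sql_v1).fetchall())

# Results not quite right? Describe the extra column in plain language and rerun.
sql_v2 = ask_llm("Same as before, but also add the average order amount.", "v2")
print(conn.execute(sql_v2).fetchall())
```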
Yeah.
So...
I think what's really remarkable over the last 12 to 18 months is, first of all, how fast AI and LLMs have been moving.
But today I would say as of, what day is it?
September of 2025, most of these platforms, especially those targeting non-engineers,
are not really geared towards building software on top of your data that a company would actually feel confident in using.
So if you, for example, look at Figma Make, you look at Bolt, Lovable, Replit, all these are really good at getting a first version of the app that you want.
But it's actually very hard to deploy that in your own AWS, for example.
It's very hard to connect that to your own Salesforce, for example, or your own Workday.
And so you actually can prototype very quickly and, let's say, figure out what to make, but then you still need to get an engineer to actually go connect these systems to your data, deploy it to production, write tests for it, et cetera.
And so what's really unique about Retool is I think we're the world's first platform where people can vibe code directly on top of Salesforce, directly on top of your own Postgres database and deploy it in your AWS, in your Azure, in your GCP.
That, I think, is why enterprises have been really flocking to Retool: there's this confluence where the technology of the LLM is so cool for vibing, if you will.
Then you have the enterprise guardrails of, I guarantee that it's going to be secure, that it's going to be reliable, that it's going to be localized, for example.
And these are things that LLMs are actually not so good at.