Daniel Dines
Podcast Appearances
This is a story that I never told anyone. I've wasted my late 20s, a big part of my 30s and 40s thinking in this way. It's a total waste of cycles and energy, man. I am a lonely wolf. I find life pretty lonely, man.
Well, I invited you basically for this podcast, so thank you for having me.
Yeah, actually, I was thinking a lot lately about what our story is within the AI narrative, where we can really bring a lot of value. So over the last two years, we've spent a lot of time really trying to fine-tune LLMs and build around them, with a certain degree of success. But I got really inspired by stories like Cursor AI, and my development team loves that product.
It's a beautiful product built on top of multiple LLMs, but it just works. I recall actually how we started UiPath. Maybe this is a story that I never told anyone. In the beginning, we were always based on AI, but we were using a library called OpenCV, which, among many other things, provided a really cool feature to find a smaller image within a bigger image.
And we repurposed that library for the sake of automation. So we could take a screenshot of an application, and if you want to click on a button, we could take an image of that button and then find it during replay: just call a function to find this image of the button, get the coordinates back, and click on the button. But that was not the only thing that we did.
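The trick described here maps closely onto OpenCV's template matching. As a rough illustration only, not UiPath's actual code, here is a minimal Python sketch that locates a saved button image inside a fresh screenshot and clicks at the match; pyautogui is used purely as a hypothetical stand-in for the screenshot-and-click layer:

```python
# Minimal sketch of image-based clicking via OpenCV template matching.
# Illustrative only; pyautogui here is an assumed helper for screen capture and clicks.
import cv2
import numpy as np
import pyautogui

def click_button(button_image_path: str, confidence: float = 0.9) -> bool:
    # Grab the current screen and convert it to an OpenCV BGR array.
    screen = cv2.cvtColor(np.array(pyautogui.screenshot()), cv2.COLOR_RGB2BGR)
    template = cv2.imread(button_image_path)  # the previously captured button image

    # Template matching: slide the button image over the screenshot and score each position.
    result = cv2.matchTemplate(screen, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)

    if max_val < confidence:
        return False  # button not currently visible on screen

    # Click the center of the best match.
    h, w = template.shape[:2]
    pyautogui.click(max_loc[0] + w // 2, max_loc[1] + h // 2)
    return True
```

Template matching like this is sensitive to scaling and theme changes, which is part of why, as he says next, pure image-based automation was only the starting point.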
We created, I think, a magical experience. So we let someone record a flow on the screen. Just show: I need to click this button, indicate the button on the screen, and then we would generate a very simple statement, like click that button with this image. Everything was stored. So you could record an entire flow based only on working with images, even typing in an edit box.
So we would capture the edit box image and then a label, and we would find them at runtime. But from the perspective of the user it was really simple. So I remember it was around 2013 when I showed this product to some guys who were really Blue Prism experts.
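For a sense of what such an image-anchored recorded flow might look like at replay time, here is a toy sketch. The step format and image file names are invented for illustration and are not UiPath's actual format; it relies on pyautogui.locateCenterOnScreen, which performs the same template-matching idea under the hood (the confidence argument requires OpenCV to be installed):

```python
# Illustrative only: a hypothetical image-anchored flow, replayed step by step.
import pyautogui

# Each recorded step stores the captured anchor image plus the action to replay.
recorded_flow = [
    {"action": "click", "image": "login_button.png"},
    {"action": "type",  "image": "username_box.png", "text": "daniel"},
    {"action": "click", "image": "submit_button.png"},
]

def replay(flow):
    for step in flow:
        # Locate the stored anchor image on the current screen, just as at record time.
        try:
            center = pyautogui.locateCenterOnScreen(step["image"], confidence=0.9)
        except pyautogui.ImageNotFoundException:
            center = None
        if center is None:
            raise RuntimeError("Anchor image not found on screen: " + step["image"])
        pyautogui.click(center.x, center.y)    # click the located element
        if step["action"] == "type":
            pyautogui.write(step["text"])      # type into the edit box just clicked

replay(recorded_flow)
```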