Panos Panay
It's hard though, you know. We've been training people, we've been training ourselves, for 10 years. Calling a timer used to be, "Can you set a timer for eight minutes?" Calling a timer on the new Alexa is, "I'm making a ramen egg." "Gotcha, I'll set a timer for eight minutes." She just proactively comes back and sets it.
I didn't demo that yesterday because I didn't want the timer headline, but it's a really badass experience. You know, it's really cool. And so there's a level of that transformation where, I'm off topic, let me go back. At the end of the day, the LLM is the base layer. Then you've got the next layer, which is just a series of different models,
picking the right model to do the job. And then that model is basically picking the right expert. And so the LLM plays a role, especially on the natural-language side of it. But as the request makes its way through the stack, it narrows down for accuracy, it narrows down for speed, and then it narrows down for holding memory and personalizing it.
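The layered narrowing he describes, a general LLM up front and then progressively more specialized models chosen for accuracy, speed, and personalization, resembles a routing cascade. A minimal sketch of that idea, with all class and function names hypothetical rather than Alexa's actual architecture:

```python
# Hypothetical sketch: a base LLM handles the natural-language layer
# (intent classification), then the request is narrowed to a
# specialized model picked to do that one job well.
from dataclasses import dataclass, field

@dataclass
class Request:
    utterance: str
    user_id: str

@dataclass
class Router:
    # Maps a coarse intent to the specialized expert that handles it.
    experts: dict = field(default_factory=dict)

    def classify_intent(self, req: Request) -> str:
        # Stand-in for the base LLM's natural-language understanding.
        if "timer" in req.utterance or "egg" in req.utterance:
            return "timers"
        return "chat"

    def route(self, req: Request) -> str:
        intent = self.classify_intent(req)
        expert = self.experts.get(intent, self.experts["chat"])
        return expert(req)

def timer_expert(req: Request) -> str:
    # A narrow, fast expert: it holds the domain knowledge
    # (ramen eggs take about eight minutes) and acts proactively.
    return "Gotcha, I'll set a timer for eight minutes."

def chat_expert(req: Request) -> str:
    return "Let's talk."

router = Router(experts={"timers": timer_expert, "chat": chat_expert})
print(router.route(Request("I'm making a ramen egg", "user-1")))
# prints: Gotcha, I'll set a timer for eight minutes.
```

The point of the cascade is that the expensive general model only has to understand the request; the narrow model that actually executes can be smaller, faster, and more accurate in its domain.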
And now you just have a series of experts basically sitting on top, and one of them is conversational. That's not just an LLM. That's a series of... and by the way, if you look at any one of these other products, they're not just LLMs. They're basically, they're mainly, I don't know if I'm overstating or understating it, and not to be rude, but they're chatbots.
And they're pretty good. They're damn good. And when you start typing, you know, long form, rewriting, dropping in summaries, very powerful. Yeah. Creating videos, creating photos, isolated but powerful. But the idea is that these experts all sit on top of the stack, and basically there's a runtime that orchestrates and says, okay, call these experts.
These two experts have to work together. Got it. And then it operates. That's just not simple. You know, the first thing I was asked when I got there was, "Hey, why don't you just replace the brain with an LLM and everything will be fine?"
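The runtime he describes, one that picks the experts a request needs and makes them work together, could be sketched as a simple orchestrator. This is an illustrative toy, not Alexa's actual API; every name here is invented:

```python
# Illustrative orchestrator: a runtime selects which experts a
# request needs, then chains them so their outputs compose.

def recipe_expert(query: str, context: dict) -> dict:
    # One expert contributes domain facts to the shared context...
    context["cook_time_minutes"] = 8
    return context

def timer_expert(query: str, context: dict) -> dict:
    # ...and a second expert acts on those facts, so the two
    # experts genuinely have to work together.
    minutes = context["cook_time_minutes"]
    context["reply"] = f"Gotcha, I'll set a timer for {minutes} minutes."
    return context

def orchestrate(query: str, plan: list) -> dict:
    # The runtime calls each selected expert in order, passing a
    # shared context dict between them.
    context = {}
    for expert in plan:
        context = expert(query, context)
    return context

result = orchestrate("I'm making a ramen egg", [recipe_expert, timer_expert])
print(result["reply"])
# prints: Gotcha, I'll set a timer for 8 minutes.
```

This is why "just swap the brain for an LLM" is not simple: the hard part is the plumbing that decides which experts to call and lets one expert's output feed the next, not any single model.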
You might have.