Andy Halliday
And is the knowledge in those helpful?
It doesn't have that mechanism.
And so it's the protocol that skills enable with the agent that's doing the work for you on your computer or in a vibe coding session. The skill is just this thing the agent can call upon to help itself.
And I don't think there's that level of communication in a custom GPT.
You basically have to go to the custom GPT, or invoke it with the app in the course of a chat, and then it will take on board the knowledge and instructions built into that custom GPT and understand that process.
So there is a similarity there.
I do get that, but I think of them as very different in terms of their fundamental architecture.
Here's another little thing that the skills framework enables, which is that a skill can be updated, just like a custom GPT can be updated and improved.
But a skill can be updated by the process of using the skill.
There's a way to instrument your Claude instance on your machine so that every time the skill is used, if something else is learned in the course of using it, you can direct Claude to recompute the skill and revise it to incorporate what it learned.
Right.
That's different.
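To make that concrete: a skill in Anthropic's framework is a folder containing a SKILL.md file with YAML frontmatter and markdown instructions. Below is a minimal sketch of what a self-updating skill could look like; the overall file layout follows the skills format, but the skill name, the task steps, and the self-update instruction wording are hypothetical illustrations, not something quoted from the conversation.

```markdown
---
name: release-notes
description: Drafts release notes from a list of merged pull requests.
---

# Release Notes Skill

1. Collect the merged PR titles and group them by area (features, fixes, docs).
2. Draft the notes in the house style described below.

## Self-update (hypothetical instruction)

After completing the task, if you discovered a convention or correction that is
not captured above, rewrite this SKILL.md to incorporate it, keeping the
frontmatter intact.
```

The self-update section is the part being described on the podcast: because the skill is just a file the agent can read and write, an instruction like this lets each use of the skill feed improvements back into the skill itself.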
One last thing and then I think we can wrap.
So at Davos, at the World Economic Forum, OpenAI's Chris Lehane revealed that the company is on track to unveil its Jony Ive-led physical AI device in the second half of this year.
So later this year, we'll see that device.
That's one of the form factors that has been posited for the Jony Ive-designed one.