Sarah Walker
Podcast Appearances
That's a highly dynamic crystal because it's a crystallization in time of this massive abstract structure that's evolved over human history and is now put into a small device.
I think there's not. And I mean that very purposefully, because a particular instantiation of a language model trained on a particular data set becomes a crystal of the language at the time it was trained. But obviously we're iterating on the technology and evolving it.
That's right.
Right. It's a societal-level technology, right? We've actually put collective intelligence in a box.
I actually don't like the sort of language we use around that. And I think the language really matters. So I don't know how to talk about how much smarter one human is than another, right? Usually we talk about abilities or particular talents someone has.
And going back to David Deutsch's idea of universal explainers, adopting the view that we're the first kinds of structures our biosphere has built that can understand the rest of reality. We have this universal comprehension capability. He makes an argument that, you…
Basically, we're the first things that actually are capable of understanding anything. It doesn't mean an individual understands everything, but we have that capability. And so there's not a difference between that and what people talk about with AGI.