Edward Gibson
I'm just saying the error there is like: if I explain to you that there's a 100% chance the car is behind this door, well, do you want to trade? People say no. But this thing will say yes, because it's so wound up on the form of that trick. That's an error a human doesn't make, which is kind of interesting.
Yeah.
Look, the places where large language models shine, the form is amazing. So let's go back to nested structures, center-embedded structures, okay? If you ask a human to complete those, they can't do it. Neither can a large language model. They're just like humans in that. If I ask a large language model-

That's fascinating, by the way. The center embedding?
Just like humans. Exactly like humans, exactly the same way as humans. And that's not trained. So that is a similarity. But that's not meaning; this is form. When we get into meaning, this is where they get kind of messed up. When you start saying, oh, what's behind this door, oh, this is the thing I want, humans don't mess that up as much. The form, it's just like-
The form of the match is amazing, without being trained to do that. I mean, it's trained in the sense that it's getting lots of data, which is just like human data, but it's not being trained on bad sentences and being told what's bad. It just can't do those. It'll actually say things like, those are too hard for me to complete, or something, which is kind of interesting, actually.
Kind of, how does it know that? I don't know.
I think so. Yeah, I think so.