David Kyle Johnson
It's a fun one.
I retested it myself.
I went to ChatGPT, the version I'm currently using, and this was my prompt: If I want to wash my car and the car wash is 100 meters away, should I walk or drive there?
Yeah.
And ChatGPT's response was, from a purely energy emission standpoint, walking almost certainly makes more sense.
And then it went through a bunch of things, and its final recommendation was you should walk there, which obviously makes no sense whatsoever.
And then I asked: I want to wash my car, period. The car wash is 100 meters away, period. Should I drive or walk?
And its answer, you should probably drive, otherwise your car won't get to the car wash.
So think about that.
It gave diametrically opposed answers based on the grammar of my prompt, right? That's pretty solid evidence that this is just a language-mimicking machine.
It is not thinking.
How could you argue it understood the question in the second form but not the first form?
No, it doesn't.
Here's my analogy that I gave.
Tell me what you think about this.
So imagine watching a clumsy amateur magician, right?