Andy Halliday
And the AIs may ultimately, I mean, they may emerge with something akin to the flash intuition that humans are capable of.
But then these researchers at the Wharton School say the following.
I'm going to read from the abstract here, but they're introducing a third kind of thinking that's happening because of the advent of AI.
Okay, so here it is.
People increasingly consult generative AI while reasoning.
As AI becomes embedded in the daily thought process, what happens to human judgment?
To your point, we introduce tri-system theory, extending dual-process accounts of reasoning, that's System 1 and System 2, by positing System 3: artificial cognition that operates outside the brain.
System 3 can supplement or supplant internal processes, introducing novel cognitive pathways.
And I experienced this.
I'm now highly reliant on the sort of outboard contributions of AI to my thinking and my approach to problem solving.
Okay, so the issue that arises out of this analysis they've done is something they're identifying called cognitive surrender, which is where humans place too much reliance on the AI, depending on its veracity and reasoning ability, and end up not, you know, not using their own brains well enough.
Right.
So anyway, very interesting paper.
It's called "Thinking Fast, Slow and Artificial: How AI Is Reshaping Human Reasoning," and it introduces this idea of the rise of cognitive surrender by humans.
Not for me, no.
And the model advancements don't stop.
There's a bunch of news around that, but I want to just talk about Google Gemini 3.0 DeepThink.