Dwarkesh Patel
Yeah, yeah.
This is a question I should have asked earlier.
So we were talking about how currently it feels like when you're doing AI engineering or AI research, these models are more like in the category of compiler rather than in the category of a replacement.
At some point, if you have quote-unquote AGI, it should be able to do what you do.
And do you feel like having a million copies of you in parallel results in some huge speed-up of AI progress?
Basically, if that does happen, do you expect to see an intelligence explosion?
Or even once we have a true AGI, I'm not talking about LLMs today, but real AGI.
You think it's continuous with this hyper-exponential trend?
Are you saying that what will happen is this: if you look at the trend from before the Industrial Revolution to now, you have a hyper-exponential, where you go from 0% growth, to 0.02% growth 10,000 years ago, to 2% growth currently.
So that's a hyper-exponential, and you're saying that if you chart AI on there, then AI takes you to 20% growth or 200% growth.
Mm-hmm.
Or you could be saying, if you look at the last 300 years, what you've been seeing is you have technology after technology: computers, electrification, steam engines, railways, et cetera.
But the rate of growth is the exact same.
It's 2%.
So are you saying the rate of growth will... No, basically I expect the rate of growth has also stayed roughly constant, right?
But only for the last 200 or 300 years.
But over the course of human history, it's like exploded, right?
It's gone from basically 0% to faster and faster, through the industrial explosion, to 2%.
But just to clarify, you're saying that the rate of growth will not change.