Rob Wiblin
And then I think I have a lot of uncertainty about how it evolves from there.
I think it depends a lot on the competition landscape they face.
So basically, if the other companies are really far behind, then there's a pretty strong incentive and reason to keep your capabilities secret: you give up quarterly profits, but maybe you don't care about that because you're running on investment money anyway. And if you can get your AI to help you make better AI, which helps you make better AI still, and so on, you could emerge with a superintelligence that might give you power rivalling nation states, or the ability to decisively control how the future goes.
And that might be very attractive to a power-seeking company.
I do think it does involve forgoing short-term profits, though, which means that if competitors are close at your heels and your investors are breathing down your neck to deliver quarterly earnings, it'll be hard.
Well, and then also, your plan is to screw over the investors in this case.
Your plan is to create a superintelligence, not to pay them back.
So create a superintelligence and take over the world, maybe.
They won't like that.
There's a mismatch in incentives between the investors and the CEO, and the CEO is sort of being a bad agent to their principal.
So basically, the more things look like an efficient competitive market with very little slack, the more the leading company will be forced to provide access to the rest of us.
Yeah, I think it's unclear.
I think they certainly have some incentive to do this, but there are two alternative uses of AI labor that might be more attractive to them. One is power seeking for themselves: building up an enormous AI lead over everyone else and then bursting onto the scene with an incredible amount of power and the ability to challenge the US government or other nation states. That might be attractive to some people.
I think that would be a very evil strategy to pursue, but it's definitely in the water.
The other thing is more mundane: just using these AIs to make normal goods and services, the products and media content and other services that people most want to pay money for in a short-term sense.