Palmer Luckey
Let's assume that AGI just exists in the more likely sense.
And I think for a while, it's likely to remain the domain of the people that you would want to have it.
I think the better way to look at it is, yes, AGI could be used to engineer really terrifying bioweapons.
But that's already the case today.
There's already really good bioweapons you can make without AI.
We struggle to defend against things like that.
I think AI would be very helpful in being able to defend against those kinds of threats.
In other words, if AGI is so good that it can come up with anthrax too, and it's just this incredibly powerful thing, I would hope that AGI can also come up with the thing that immunizes people against it or that builds nanobots that are in our bloodstream at all times that are able to, on the fly, compute exactly what they're going to do to stop any biological threat, natural or man-made.
It's a case where the defensive uses like that are actually going to outweigh the offensive uses in the long run.
The thing to worry about is if, let's say, North Korea or Iran figures out how to do it with a relatively small number of people.
They have the brilliant insight that nobody else does, and then they use it to create super weapons.
But at the end of the day, I think that actually comes down less to an AI problem and more of a people problem.
I'm way more scared of really evil people with existing weapons than the possibility that they get slightly better weapons.
If they were to do something, the only thing stopping some of these nations is the threat of massive retaliation.
I'm not sure that a nuclear-equipped Iran is less spooky than an AI bioweapon-equipped Iran, mostly because the way you stop them from deploying either of those capabilities isn't actually tied to the technology itself.
It's tied to everything around it.
I'm gonna split it into two areas.
There are a lot of people who have very directly given me access to the resources I have today, where it's very clear that without them believing in me, I would not be anywhere.
This is one of the reasons I like Peter so much.
Founders Fund was really critical for Oculus.