Joscha Bach
And so what we should be doing is working towards creating this equilibrium by working as hard as we can in all possible directions. And at least that's the way in which I understand the gist of effective accelerationism. And so when he asked me what I think about this position, I said, it's a very beautiful position and I suspect it's wrong, but not for obvious reasons.
And in this tweet, I tried to make a joke about my intuition about what might possibly be wrong about it. So Roko's Basilisk and the paperclip maximizer are both boogeymen of the AI doomers. Roko's Basilisk is the idea that there could be an AI that is going to punish everybody for eternity, by simulating them, if they don't help in creating Roko's Basilisk.
It's probably a very good idea to get AI companies funded by going to VCs and telling them: give us a million dollars, or it's going to be a very ugly afterlife. And I think that there is a logical mistake in Roko's Basilisk, which is why I'm not afraid of it. But it's still an interesting thought experiment.
I think that there is no retrocausation. So basically, once Roko's Basilisk is there, if it punishes you retroactively, it has to make this choice in the future. There is no mechanism that automatically creates a causal relationship between you now defecting against Roko's Basilisk, or serving it, and what it later chooses to do.
After Roko's Basilisk is in existence, it has no more reason to worry about punishing everybody else. So that would only work if you were building something like a doomsday machine, as in Dr. Strangelove: something that inevitably gets triggered when somebody defects.
And because Roko's Basilisk doesn't exist yet, to a point where this inevitability could be established, Roko's Basilisk is nothing that you need to be worried about. The other one is the paperclip maximizer, right? This idea that you could build some kind of golem that, once it starts building paperclips, is going to turn everything into paperclips.
And so the effective accelerationism position might be to say that you basically end up with these two entities being at each other's throats for eternity and thereby neutralizing each other. And as a side effect of neither of them being able to take over, and each of them limiting the effects of the other, you would have a situation where you get all the nice effects of both, right?