Joscha Bach
Maybe you have other games, maybe it switches from time to time. But there is a certain perspective where you might be thinking: what is the longest possible game that you could be playing? Cancer, for instance, is playing a shorter game than your organism. Cancer is an organism playing a shorter game than the regular organism.
And because the cancer cannot procreate beyond the organism, except for some infectious cancers, like the one that is wiping out the Tasmanian devils, you typically end up with a situation where the organism dies together with the cancer, because the cancer has destroyed the larger system by playing a shorter game.
And so ideally, you want, I think, to build agents that play the longest possible games. And the longest possible game is to keep entropy at bay as long as possible while doing interesting stuff.
Currently, I'm pretty much identified as a conscious being. It's the minimal identification that I managed to get together because if I turn this off, I fall asleep. And when I'm asleep, I'm a vegetable. I'm no longer here as an agent. So my agency is basically predicated on being conscious. And what I care about is other conscious agents. They're the only moral agents for me.
And so if an AI were to treat me as a moral agent that it is interested in coexisting with, cooperating with, and maybe mutually supporting, it is, I think, necessary that the AI thinks that consciousness is a viable and important mode of existence. So I think it would be very important to build conscious AI, and to do this as the primary goal.
So not just to say: we want to build a useful tool that we can use for all sorts of things, and then you have to make sure that the impact on the labor market is not too disruptive and is manageable, and the impact on copyright holders is manageable and not too disruptive, and so on. I don't think that's the most important game to be played.
I think that we will see extremely large disruptions of the status quo that are quite unpredictable at this point. And I just personally want to make sure that some of the stuff on the other side is interesting and conscious.