David Boree
Her particular belief is that the singularity, when it happens, is going to occur in a flash, kind of like the rapture, and almost immediately lead to the creation of either a hell or a heaven. This will be done by the singleton, which is the term they use for this inevitable AI. That's what they call the AI god that's going to come about. Her obsession is that
she has to find a way to make the singleton a nice AI that cares about animals as much as it cares about people, right? That's her initial big motivation. So she starts emailing Tomasik with her concerns, because she's worried that the other rationalists aren't vegans, right, and they don't feel like animal welfare is the top priority for making sure this AI is good.
And she really wants to convert this whole community to veganism in order to ensure that the singleton is as focused on insect and animal welfare as human welfare. And Tomasik does care about animal rights, but he disagrees with her because he's like, no, what matters is maximizing the reduction of suffering.
And like a good singleton will solve climate change and shit, which will be better for the animals. And if we focus on trying to convert everybody in the rationalist space to veganism, it's going to stop us from accomplishing these bigger goals, right? This is shattering to Ziz, right? She decides that Tomasik doesn't care about good things.
And she decides that she's basically alone in her values.
That sounds like we're on our way. She first considers embracing what she calls negative utilitarianism. And this is an example of the fact that from the jump, this is a young woman who's not well, right? Because her hero is like, I don't know if veganism is necessarily the priority we have to embrace right now. Her immediate response is to jump to, well, maybe what I should do
is optimize myself to cause as much harm to humanity as possible and, quote, "destroy the world to prevent it from becoming hell for mostly everyone." So that's a jump, you know? That's not somebody who's doing well, who you think is healthy, right?