She's going to make that big tech money while she figures out a startup idea and finds a co-founder who will let her make enough money to change and save the world. Well, the whole universe. Her first plan is to give the money to MIRI, Yudkowsky's organization, so it can continue its important work imagining a nice AI.
She's got enough family money that her parents are able to pay for, I think, six months or more of rent in the Bay, which is not nothing; it's not a cheap place to live. I don't know exactly how long her parents are paying, but that implies a degree of financial comfort. Right. Yeah.
So she gets hired by a startup very quickly because, again, she's a very gifted computer engineer. Yeah, with a resume. Right? Yes. It's some sort of gaming company. But at this point, she's made another change in her ethics system based on Eliezer Yudkowsky's writings. One of Yudkowsky's essays talks about the difference between consequentialism and virtue ethics, right?
Consequentialists are people who focus entirely on what the outcome of their actions will be. It kind of doesn't matter what they're doing, even if it's sometimes a little fucked up, as long as the end result is good. Virtue ethics people have a code and stick to it, right?
And, actually, I'm kind of surprised he came to this, but Yudkowsky's conclusion is that while on paper you're more likely to succeed as a consequentialist, in practice virtue ethics has the best outcomes. People tend to do well when they stick to a code, rather than going with anything goes as long as I succeed, right?
And I think that's actually a pretty decent way to live your life.
Yeah. It's a reasonable conclusion for him, so I don't blame him on this part. But here's the problem. Ziz is trying to break into and succeed in the tech industry. And you are very unlikely to succeed at a high level in the tech industry if you are unwilling to do things and have things done to you that are unethical and fucked up. I'm not saying this is good.