
Sam Harris

👤 Speaker
4067 total appearances


Podcast Appearances

Making Sense with Sam Harris
#469 — Escaping an Anti-Human Future

Yeah.

Yeah, I mean, among the things that worry me the most is the testimony of the people, again, who are close enough to the technology to be totally credible.

who won't concede any of these fears, right? I mean, it's the people who, it's weird. You'll hear Sam talk about the risk. He just did an interview in the last couple days, and he talked about the risks of a major cyber event this year. Yeah, he's an unusual voice in that he will, if you

I haven't seen him asked this question lately, but the last time I saw him asked point blank about the alignment problem, he totally conceded that it's a problem.

Right.

So there's the way in which this could go completely off the rails, and, you know, this is intrinsically dangerous if not aligned.

Right.

Yeah.

I mean, just probabilistically, you have to imagine there are more ways to build super intelligent AI that are unaligned than aligned, right?

So if we haven't figured out the principle by which we would align it, the idea that we're going to do it by chance seems far-fetched.

That's right.

When you ask someone like Sam Altman, what's the probability we're going to destroy everything with this technology?

And the answer is, like, between 10 and 20 percent, or 30 percent.

No one's saying one in a million.

Well, there are five people.

I mean, you can count on one hand, or at most two, the number of people whose minds would have to change in order to solve this coordination problem, at least in America.

Sam Altman has represented his situation.

I don't know if this is honest, maybe, but for years, when asked, he's been saying, you know, regulate me.

Like, yeah, you know, I can't do this myself.

Yes.