
Max Tegmark

👤 Speaker
2251 total appearances
Voice ID

Voice Profile Active

This person's voice can be automatically recognized across podcast episodes using AI voice matching.

Voice samples: 2
Confidence: High

Appearances Over Time

Podcast Appearances

Lex Fridman Podcast
#371 – Max Tegmark: The Case for Halting AI Development

In a nuclear reaction, if each one can make more than one, then you get an exponential growth in that. We call it a nuclear explosion. All explosions are like that. And an intelligence explosion, it's just exactly the same principle: some amount of intelligence can make more intelligence than that, and then repeat. You always get exponentials.
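The chain-reaction picture above can be sketched numerically: whenever each unit produces more than one successor per step (a multiplication factor k > 1), the total grows exponentially, and at k ≤ 1 it fizzles out. The factor values and step counts here are illustrative, not anything from the conversation.

```python
# Toy sketch of the chain-reaction analogy: multiply a population by a
# factor k each step. k > 1 gives exponential growth ("supercritical");
# k < 1 decays toward zero ("subcritical").
def grow(k: float, steps: int, start: float = 1.0) -> float:
    """Return the population after `steps` rounds of multiplication by k."""
    total = start
    for _ in range(steps):
        total *= k
    return total

print(grow(2.0, 10))   # supercritical: 1 doubles to 1024 in 10 steps
print(grow(0.9, 10))   # subcritical: shrinks below the starting value
```

The same loop covers both the nuclear case and the intelligence-explosion analogy; only the interpretation of the multiplication factor changes.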

It's obviously gonna stop when it bumps up against the laws of physics. There are some things you just can't do no matter how smart you are, right? Because we don't know the full laws of physics yet, right? Seth Lloyd wrote a really cool paper on the physical limits of computation. For example, if you put too much energy into a finite space, it'll turn into a black hole; you can't move information around faster than the speed of light, stuff like that. And it's hard to store much more than a modest number of bits per atom, etc. But those limits are just astronomically, like 30 orders of magnitude, above where we are now.
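The point that exponentials eventually stop at physical limits can be sketched as growth capped at a hard ceiling. The growth factor and the ceiling of 1e30 (echoing the "30 orders of magnitude" of headroom mentioned above) are purely illustrative numbers.

```python
# Sketch: exponential growth that saturates at a hard ceiling, loosely
# modeling "exponentials stop when they hit the laws of physics."
# The factor 10.0 and ceiling 1e30 are illustrative assumptions.
def capped_growth(k: float, steps: int, ceiling: float, start: float = 1.0) -> float:
    """Multiply by k each step, but never exceed the ceiling."""
    level = start
    for _ in range(steps):
        level = min(level * k, ceiling)  # grow, then clamp at the limit
    return level

print(capped_growth(10.0, 50, 1e30))  # flattens at the ceiling, not 10**50
```

The curve looks exponential for a long time and then abruptly flattens, which is the qualitative shape the argument relies on.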

So that's a bigger jump in intelligence than if you go from an ant to a human. Of course, what we want to do is have a controlled thing. In a nuclear reactor, you put moderators in to make sure exactly that it doesn't blow up out of control, right? When we do experiments with biology and cells and so on, we also try to make sure it doesn't get out of control. We can do this with AI too. The thing is, we haven't succeeded yet.

And Moloch is doing exactly the opposite: just fueling, just egging everybody on, faster, faster, faster, or the other company is going to catch up with you, or the other country is going to catch up with you. We have to want this stuff. And I don't believe in just asking people to look into their hearts and do the right thing. It's easier for others to say that. But if you're in a situation where your company is going to get screwed by other companies that are not stopping, you're putting people in a very hard situation.