Terence Tao

Speaker
2047 total appearances

Podcast Appearances

Lex Fridman Podcast
#472 – Terence Tao: Hardest Problems in Mathematics, Physics & the Future of AI

So the result that I proved, roughly speaking, is that statistically, like 90% of all inputs would drift down, maybe not all the way to one, but to something much, much smaller than what you started with.
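(The iteration being discussed is the Collatz map: multiply an odd number by three and add one, halve an even number. Below is a minimal Python sketch of that map, with a rough empirical check of the "most inputs drift down" statement; it is only an illustration, not the argument of the actual proof.)

```python
# Illustrative sketch (not from the episode): the Collatz map and a quick
# empirical check of the "most inputs drift below where they started" claim.

def collatz_step(n: int) -> int:
    """One step of the Collatz iteration: 3n+1 if n is odd, n/2 if n is even."""
    return 3 * n + 1 if n % 2 else n // 2

def drifts_below_start(n0: int, max_steps: int = 10_000) -> bool:
    """Return True if the orbit of n0 eventually falls below its starting value."""
    n = n0
    for _ in range(max_steps):
        n = collatz_step(n)
        if n < n0:
            return True
    return False

# Check a range of starting values; in practice essentially all of them drop.
starts = range(2, 100_000)
fraction = sum(drifts_below_start(n) for n in starts) / len(starts)
print(f"fraction that drift below their start: {fraction:.4f}")
```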

So it's like if I told you that if you go to a casino and keep playing for long enough, most of the time you end up with a smaller amount in your wallet than when you started.

That's kind of like the result that I proved.

Well, the problem is that I used arguments from probability theory.

And there's always this exceptional event.

So in probability, we have these laws of large numbers, which tell you things like: if you play a game at a casino with a losing expectation, over time you are guaranteed, almost surely, with probability as close to 100% as you wish, to lose money. But there's always this exceptional outlier. It is mathematically possible that, even in a game where the odds are not in your favor, you could just keep winning slightly more often than you lose. Very much like how in Navier-Stokes, it could be that most of the time your waves disperse, but there could be just one outlier choice of initial conditions that would lead you to blow up.
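(For the casino analogy, a small simulation of a game with a losing expectation, using made-up odds of 49% to win each bet, shows the law-of-large-numbers behavior: nearly every long session ends with less money than it started with, even though nothing forbids a rare outlier session that stays ahead.)

```python
import random

# Illustrative only: a bet that wins 1 with probability 0.49 and loses 1
# otherwise, i.e. a game with a losing expectation.
def play_session(rounds: int = 10_000, start: float = 100.0, seed: int = 0) -> float:
    rng = random.Random(seed)
    wallet = start
    for _ in range(rounds):
        wallet += 1 if rng.random() < 0.49 else -1
    return wallet

# Over many long sessions, virtually all end below the starting amount.
sessions = [play_session(seed=s) for s in range(1_000)]
losers = sum(w < 100.0 for w in sessions)
print(f"{losers} of {len(sessions)} sessions ended with less than they started")
```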

And there could be one outlier choice of a special number that you stick in that shoots off to infinity while all other numbers crash to earth, crash to one.

In fact, there are some mathematicians, Alex Kontorovich, for instance, who've proposed that these Collatz iterations are actually like cellular automata.

If you look at what happens in binary, they do actually look a little bit like these Game of Life-type patterns.
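(One way to see the pattern being described is to print the binary expansions of successive Collatz iterates; a rough illustration, not taken from the episode.)

```python
# Print successive Collatz iterates in binary; the way the bit patterns shift
# and interact from step to step is what resembles a cellular automaton or a
# Game of Life evolution.
def collatz_orbit(n: int):
    while n != 1:
        yield n
        n = 3 * n + 1 if n % 2 else n // 2
    yield 1

width = 40  # pad so the bit patterns line up column by column
for value in collatz_orbit(27):  # 27 has a famously long orbit
    print(format(value, "b").rjust(width))
```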

And in analogy to how the Game of Life can create these massive self-replicating objects and so forth, possibly you could create some sort of heavier-than-air flying machine, a number which is actually encoding this machine, whose job it is to create a version of itself which is larger.

A heavier-than-air machine encoded in a number that flies forever.

So Conway, in fact, worked on this problem as well.

Oh, wow.

So Conway, similarly, in fact, that was one of my inspirations for the Navier-Stokes project: Conway studied generalizations of the Collatz problem where, instead of multiplying by three and adding one or dividing by two, you have more complicated branching rules.

But instead of having two cases, maybe you have 17 cases and then you go up and down.
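(A generalized Collatz-style map of this kind can be sketched by picking a modulus and an affine rule for each residue class; the coefficients below are arbitrary example values chosen for illustration, not Conway's actual construction.)

```python
# Illustrative "generalized Collatz" map with modulus 3: for each residue class
# mod 3, apply an affine rule chosen so the result is always an integer.  The
# coefficients are example values only.
def generalized_step(n: int) -> int:
    r = n % 3
    if r == 0:
        return n // 3               # n ≡ 0 (mod 3): divide by 3
    if r == 1:
        return (7 * n + 2) // 3     # n ≡ 1 (mod 3): 7n + 2 is divisible by 3
    return (5 * n + 2) // 3         # n ≡ 2 (mod 3): 5n + 2 is divisible by 3

# Trace a short orbit from an arbitrary starting value.
n = 10
for _ in range(15):
    print(n, end=" ")
    n = generalized_step(n)
print()
```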

And he showed that once your iteration gets complicated enough, you can actually encode Turing machines and you can actually make these problems undecidable and do things like this.

In fact, he invented a programming language for these kinds of fractional linear transformations.
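(The language being referred to is presumably Conway's FRACTRAN: a program is just an ordered list of fractions, and starting from an integer n you repeatedly replace n by n times the first fraction that keeps it an integer, halting when none does. Below is a minimal interpreter sketch with a tiny one-fraction addition program; Conway's famous PRIMEGAME program enumerates the primes with the same mechanism.)

```python
from fractions import Fraction

# Minimal FRACTRAN interpreter sketch: at each step, n is replaced by n*f for
# the first fraction f in the program such that n*f is an integer; the run
# halts when no fraction applies.
def run_fractran(program, n, max_steps=1_000):
    program = [Fraction(f) for f in program]
    for _ in range(max_steps):
        for f in program:
            m = n * f
            if m.denominator == 1:
                n = int(m)
                break
        else:
            return n  # no fraction applies: halt
    return n

# Tiny example: the one-fraction program [3/2] turns 2**a * 3**b into 3**(a+b),
# i.e. it adds the two exponents.
a, b = 4, 3
result = run_fractran([Fraction(3, 2)], 2**a * 3**b)
print(result == 3 ** (a + b))  # True
```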