

LessWrong (Curated & Popular)
"Deep learning as program synthesis" by Zach Furman

In fact, it's a mathematical restatement of a near-trivial fact: with exponential resources, one can simply memorize a function's behavior. The constructions used to prove the theorem are effectively building a continuous version of a look-up table.
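The "continuous look-up table" idea can be made concrete with a small sketch (my illustration, not the author's construction): a piecewise-constant approximator that tabulates a Lipschitz function at the centers of a regular grid over [0, 1]^d. At a fixed per-axis resolution of about 1/eps, the table needs on the order of (1/eps)^d entries, exponential in the dimension.

```python
import itertools
import math

def grid_lookup_approximator(f, d, eps):
    """Build a piecewise-constant "continuous lookup table" for a
    Lipschitz function f on [0, 1]^d.

    Grid spacing eps gives ceil(1/eps) cells per axis, so the table
    holds roughly (1/eps)^d entries -- exponential in d.
    Returns (approx_fn, num_table_entries)."""
    k = math.ceil(1 / eps)  # cells per axis
    # Tabulate f at the center of every grid cell.
    table = {
        idx: f([(i + 0.5) / k for i in idx])
        for idx in itertools.product(range(k), repeat=d)
    }

    def approx(x):
        # Return the stored value for the cell containing x.
        idx = tuple(min(int(xi * k), k - 1) for xi in x)
        return table[idx]

    return approx, len(table)

# At fixed accuracy (eps = 0.1), the parameter count explodes with d:
f = lambda x: sum(x) / len(x)  # a simple Lipschitz test function
for d in (1, 2, 3):
    _, n = grid_lookup_approximator(f, d, eps=0.1)
    print(d, n)  # prints: 1 10, 2 100, 3 1000
```

The point of the sketch is only the counting: each added input dimension multiplies the table size by another factor of 1/eps.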

This is not an explanation for the success of deep learning. It is a proof that if deep learning had to deal with arbitrary functions, it would be hopelessly impractical. This is not merely a weakness of the UAT's particular proof; it is a fundamental property of high-dimensional spaces. Classical results in approximation theory show that this exponential scaling is not just an upper bound on what's needed, but a strict lower bound. These theorems prove that any method that aims to approximate arbitrary smooth functions is doomed to suffer the curse of dimensionality.
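The narration does not quote the specific theorem, but a schematic lower bound of this type (in the style of DeVore-type results on nonlinear approximation; the constants and exact hypotheses here are illustrative, not a quotation of the source) reads:

```latex
% Schematic curse-of-dimensionality lower bound (illustrative).
% For any approximation scheme \Phi : \mathbb{R}^n \to C([0,1]^d)
% with n parameters chosen continuously in f,
\sup_{\|f\|_{C^s([0,1]^d)} \le 1}
  \bigl\| f - \Phi(a(f)) \bigr\|_{\infty}
  \;\ge\; C \, n^{-s/d},
% so reaching accuracy \varepsilon forces
% n \gtrsim \varepsilon^{-d/s}:
% exponential in the dimension d for fixed smoothness s.
```

Note the bound constrains every such scheme at once, neural or otherwise, which is why the text can say "any method" is affected.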

There's a details box here with the title "the parameter count lower bound"; its contents are omitted from this narration.

The real lesson of the universal approximation theorem, then, is not that neural networks are powerful. The real lesson is that if the functions we learn in the real world were arbitrary, deep learning would be impossible. The empirical success of deep learning with a reasonable number of parameters is therefore a profound clue about the nature of the problems themselves: they must have structure.

The program synthesis hypothesis gives a name to this structure: compositionality. This is not a new idea; it is the foundational principle of computer science. To solve a complex problem, we do not write down a giant lookup table that specifies the output for every possible input.
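The contrast can be sketched with a toy example (mine, not from the source): computing the parity of n bits either as an explicit lookup table, whose size is 2^n, or compositionally, by folding a single two-bit XOR subroutine across the input, whose description size grows linearly in n.

```python
from functools import reduce
from itertools import product

def parity_table(n):
    """The lookup-table approach: store the answer for every
    possible n-bit input -- 2**n entries."""
    return {
        bits: reduce(lambda a, b: a ^ b, bits)
        for bits in product((0, 1), repeat=n)
    }

def parity_composed(bits):
    """The compositional approach: reuse one tiny primitive (XOR)
    n - 1 times instead of enumerating all inputs."""
    return reduce(lambda a, b: a ^ b, bits)

table = parity_table(10)
print(len(table))                      # 1024 entries just for 10 bits
print(parity_composed((1, 0, 1, 1)))   # 1
```

Both compute the same function, but only the compositional version stays small as n grows; that gap is what the lookup-table framing of the UAT constructions leaves on the table.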