LessWrong (Curated & Popular)
"Deep learning as program synthesis" by Zach Furman

Instead, we write a program.

we break the problem down hierarchically into a sequence of simple, reusable steps.

Each step, like a logic gate in a circuit, is a tiny lookup table, and we achieve immense expressive power by composing them.
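A minimal sketch of this idea (not from the post; the gate tables and the adder example are illustrative): each step is literally a tiny lookup table over Boolean inputs, and larger functions arise purely by composing them.

```python
# Each "step" is a tiny lookup table: a 2-input Boolean gate.
AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
OR  = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def half_adder(a, b):
    """Compose two lookup tables into a slightly bigger reusable step."""
    return XOR[(a, b)], AND[(a, b)]  # (sum bit, carry bit)

def full_adder(a, b, carry_in):
    """Compose half adders hierarchically into a full adder."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR[(c1, c2)]

# 1 + 1 + 1 = 0b11: sum bit 1, carry bit 1
assert full_adder(1, 1, 1) == (1, 1)
```

Chaining full adders bit by bit gives integer addition; no single table ever grows large, yet the composed program computes a function far bigger than any one step.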

This matches what we see empirically in some deep neural networks via mechanistic interpretability.

They appear to solve complex tasks by learning a compositional hierarchy of features.

A vision model learns to detect edges, which are composed into shapes, which are composed into object parts such as wheels and windows, which are finally composed into an object detector for a car.

The network is not learning a single, monolithic function.

It is learning a program that breaks the problem down.

This parallel with classical computation offers an alternative perspective on the approximation question.

While the UAT considers the case of arbitrary functions, a different set of results examines how well neural networks can represent functions that have this compositional, programmatic structure.

One of the most relevant results comes from considering Boolean circuits, which are a canonical example of programmatic composition.

It is known that feedforward neural networks can represent any program implementable by a polynomial-sized Boolean circuit using only a polynomial number of neurons.
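A hedged sketch of why this is plausible (an illustration of the standard gate-by-gate simulation, not a proof of the polynomial-size bound): a single threshold neuron can compute AND, OR, and NOT exactly on {0, 1} inputs, so a network can mirror any Boolean circuit with roughly one neuron per gate. The weights and gate names below are illustrative choices.

```python
def neuron(weights, bias, inputs):
    """Threshold unit: outputs 1 iff the weighted sum plus bias is positive."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

# One neuron per Boolean gate, exact on {0, 1} inputs:
def and_gate(x, y): return neuron([1, 1], -1.5, [x, y])  # fires iff x + y > 1.5
def or_gate(x, y):  return neuron([1, 1], -0.5, [x, y])  # fires iff x + y > 0.5
def not_gate(x):    return neuron([-1], 0.5, [x])        # fires iff x < 0.5

# Compose neurons exactly as the circuit composes gates:
# XOR(x, y) = (x OR y) AND NOT(x AND y)
def xor_gate(x, y):
    return and_gate(or_gate(x, y), not_gate(and_gate(x, y)))

assert [xor_gate(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 0]
```

Since each gate costs a constant number of neurons, a polynomial-sized circuit translates into a polynomial-sized network, which is the shape of the guarantee described above.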

This provides a different kind of guarantee than the UAT.

It suggests that if a problem has an efficient programmatic solution, then an efficient neural network representation of that solution also exists.

This offers an explanation for how neural networks might evade the curse of dimensionality.

Their effectiveness would stem not from an ability to represent any high-dimensional function, but from their suitability for representing the tiny, structured subset of functions that have efficient programs.

The problems seen in practice, from image recognition to language translation, appear to belong to this special class.

There's a details box here with the title "Why compositionality, specifically".

Evidence from depth separation results.

The box contents are omitted from this narration.