Zach Furman

👤 Speaker
696 total appearances

Podcast Appearances

LessWrong (Curated & Popular)
"Deep learning as program synthesis" by Zach Furman

Compositional relationships between programs might correspond to some notion of path adjacency defined by the parameter-function map.
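
To fix notation for what follows (my sketch; the excerpt itself stays informal): the parameter-function map sends weights to the function the network computes,

$$\Phi : \Theta \to \mathcal{F}, \qquad \Phi(\theta) = f_\theta, \qquad \Theta \subseteq \mathbb{R}^d,$$

and one illustrative reading of "path adjacency" is that programs $p$ and $q$ are adjacent when some continuous path $\gamma : [0, 1] \to \Theta$ has $\Phi(\gamma(0))$ implementing $p$ and $\Phi(\gamma(1))$ implementing $q$, without the loss blowing up along the way. The symbols $\Phi$, $\Theta$, $\mathcal{F}$, and $\gamma$ are introduced here, not taken from the post.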

If programs sharing structure are nearby, reachable from each other via direct paths, and if simpler programs lie along paths to more complex ones, then efficiency, simplicity bias, and empirically observed stagewise learning would follow naturally.

Gradient descent would build incrementally rather than search randomly.

The enumeration problem that dooms Solomonoff induction would dissolve into traversal.
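
As a caricature of the contrast (a toy I am constructing here, not an experiment from the post): brute-force program search pays for every candidate it enumerates, while gradient descent on a continuous relaxation of the same hypothesis space moves through it, refining one approximate program into the next.

```python
# Toy contrast: enumerate 3-bit truth tables vs. descend a continuous
# relaxation of the same space. Purely illustrative.
import itertools
import math

inputs = list(itertools.product([0, 1], repeat=3))
target = [x ^ y ^ z for (x, y, z) in inputs]  # target "program": parity

# Enumeration: check truth tables one by one (2^8 here; 2^(2^n) in
# general, the blow-up behind the enumeration problem).
for tried, bits in enumerate(itertools.product([0, 1], repeat=8), start=1):
    if list(bits) == target:
        break
print(f"enumeration: {tried} candidates examined")

# Traversal: relax each table entry to a real parameter behind a sigmoid
# and descend squared error, a continuous path toward the same program.
sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
theta, lr = [0.0] * 8, 2.0
for _ in range(200):
    for i, t in enumerate(target):
        p = sigmoid(theta[i])
        theta[i] -= lr * 2.0 * (p - t) * p * (1.0 - p)  # d/dtheta (p - t)^2
print("traversal recovered target:",
      [int(sigmoid(v) > 0.5) for v in theta] == target)
```

The toy collapses everything interesting (the table entries do not interact), but it makes the asymmetry vivid: one procedure counts candidates, the other takes a path.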

This is speculative and imprecise.

But there's something about the shape of what's needed that feels mathematically natural.

The representation problem asks for a correspondence at the level of objects: strata in parameter space corresponding to programs.

The search problem asks for something stronger: that this correspondence extends to paths.

Paths in parameter space, which gradient descent traverses, should correspond to some notion of relationship or transition between programs.
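
In the notation from above (again mine, and only suggestive): for a path $\gamma : [0, 1] \to \Theta$ from $\theta_0$ to $\theta_1$, one wants an induced transition

$$\Phi(\gamma) : \Phi(\theta_0) \rightsquigarrow \Phi(\theta_1)$$

between the corresponding programs, with concatenation of paths matching composition of transitions.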

This is a familiar move in higher mathematics, sometimes formalized by category theory.

Once you have a correspondence between two kinds of objects, you ask whether it extends to the relationships between those objects.

It is especially familiar in fields like higher category theory to ask these kinds of questions when the relationships between objects take the form of paths.
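
One hedged way to write down the hoped-for structure, going beyond anything the excerpt commits to: a functor

$$F : \mathcal{P}(\Theta) \longrightarrow \mathbf{Prog},$$

where $\mathcal{P}(\Theta)$ is a category of paths in parameter space (objects: parameters; morphisms: paths, perhaps up to homotopy) and $\mathbf{Prog}$ is some category of programs and rewrites between them. Whether $\mathbf{Prog}$ can be defined so that such an $F$ exists and says anything nontrivial is precisely the open question.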

I don't claim that existing machinery from these fields applies directly, and certainly not given the lack of detail I've provided in this post.

But the question is suggestive enough to investigate.

What should adjacency between programs mean?

Does the parameter-function map induce or preserve such structure?

And if so, what does this predict about learning dynamics that we could check empirically?
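
One concrete probe in that direction (my construction, not a protocol from the post): log every training step at which the network's thresholded truth table, the "program" it currently implements, changes. Parameters move continuously, so clustered, discrete function changes separated by plateaus would be one signature of stagewise, program-like learning.

```python
# Minimal stagewise-learning probe: train a tiny network on 3-bit parity
# and report each epoch at which its implemented truth table changes.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[int(b) for b in f"{i:03b}"] for i in range(8)], float)
y = X.sum(axis=1) % 2  # target: 3-bit parity

W1, b1 = rng.normal(0, 1, (3, 8)), np.zeros(8)   # tanh hidden layer
W2, b2 = rng.normal(0, 1, (8,)), 0.0             # sigmoid output

lr, prev_table = 1.0, None
for epoch in range(3000):
    h = np.tanh(X @ W1 + b1)
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))
    table = tuple((p > 0.5).astype(int))  # the function implemented right now
    if table != prev_table:
        acc = float((np.array(table) == y).mean())
        print(f"epoch {epoch}: function changed, accuracy {acc:.2f}")
        prev_table = table
    # full-batch gradient step on binary cross-entropy
    d_logit = (p - y) / len(X)
    d_h = np.outer(d_logit, W2) * (1 - h**2)
    W2 -= lr * h.T @ d_logit; b2 -= lr * d_logit.sum()
    W1 -= lr * X.T @ d_h;     b1 -= lr * d_h.sum(axis=0)
```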

Appendix

Related work