
Zach Furman

Speaker
696 total appearances


Podcast Appearances

LessWrong (Curated & Popular)
"Deep learning as program synthesis" by Zach Furman

In the 1960s, Ray Solomonoff formalized this idea into a theory of universal induction, which we now call Solomonoff induction.

He defined the simplicity of a hypothesis as the length of the shortest program that can describe it, a concept known as Kolmogorov complexity.
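To pin this down with the standard textbook formalism (not quoted from the episode, just the usual definition): fix a universal Turing machine U; the Kolmogorov complexity of a string x is

K(x) = \min\{\, |p| : U(p) = x \,\}

the length in bits of the shortest program p that makes U output x and halt. The choice of U shifts K(x) only by an additive constant, which is why the definition is treated as essentially machine-independent.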

An ideal Bayesian learner, according to Solomonoff, should prefer hypotheses (programs) that are short over ones that are long.

This learner can, in theory, learn anything that is computable, because it searches the space of all possible programs, using simplicity as its guide to navigate the infinite search space and generalize correctly.
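Written out in the standard formulation (again, my gloss rather than anything stated in the episode), this simplicity preference is the universal prior, which weights each string x by every program that could have produced it:

M(x) = \sum_{p :\, U(p) \text{ starts with } x} 2^{-|p|}

where the sum runs over (prefix-free) programs whose output begins with x, so each additional bit of program length halves a hypothesis's weight. Prediction is then just Bayesian conditioning on the data seen so far:

M(x_{t+1} \mid x_{1:t}) = M(x_{1:t}\, x_{t+1}) / M(x_{1:t})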

The invention of Solomonoff induction launched a rich and productive subfield of computer science, algorithmic information theory, which persists to this day.

Solomonoff induction is still widely viewed as the ideal or optimal self-supervised learning algorithm, a claim one can prove formally under some assumptions.

These ideas, and extensions of them like AIXI, were influential for early deep learning thinkers like Jürgen Schmidhuber and Shane Legg, and shaped a line of work attempting to theoretically predict how smarter-than-human machine intelligence might behave, especially within AI safety.

Unfortunately, despite its mathematical beauty, Solomonoff induction is completely intractable.

Vanilla Solomonoff induction is incomputable, and even approximate versions, like induction with the speed prior, are exponentially slow.
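To make the blow-up concrete, here is a deliberately toy sketch (my illustration, not code from the post, and the bit-string "language" is invented for the example): a brute-force search for the shortest "program" that produces a target output, with a step cap standing in for the fact that one cannot tell in general whether a given program halts.

from itertools import product

def run_toy_program(bits, max_steps=100):
    # Stand-in for a universal machine U: interpret a bit string as a trivial
    # accumulator machine. A real U would be far richer; the step cap is needed
    # because in general one cannot decide whether U(p) ever halts.
    acc, steps = 0, 0
    for b in bits:
        steps += 1
        if steps > max_steps:
            return None  # give up: treat the program as non-halting
        acc = acc * 2 + b  # toy semantics: fold the bits into an integer
    return acc

def shortest_program_for(target, max_len=16):
    # Brute-force search for the shortest bit string whose toy output equals
    # `target`. Ruling out all programs of length <= L means visiting
    # 2 + 4 + ... + 2**L candidates, i.e. exponentially many in L.
    for length in range(1, max_len + 1):
        for bits in product((0, 1), repeat=length):
            if run_toy_program(bits) == target:
                return bits
    return None

print(shortest_program_for(6))  # (1, 1, 0): three bits suffice on this toy machine

Even this toy loop has to examine roughly 2^(L+1) candidates to rule out every program shorter than L bits; with a genuinely universal machine, and with halting undecidable, the situation only gets worse.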

Theoretical interest in it as a Platonic ideal of learning remains to this day, but practical artificial intelligence has long since moved on, assuming it to be hopelessly infeasible.

Meanwhile, neural networks were producing results that nobody had anticipated.

This was not the usual pace of scientific progress, where incremental advances accumulate and experts see breakthroughs coming.

In 2016, most Go researchers thought human-level play was decades away.

AlphaGo arrived that year.

Protein folding had resisted 50 years of careful work.

AlphaFold essentially solved it over a single competition cycle.

Large language models began writing code, solving competition math problems, and engaging in apparent reasoning, capabilities that emerged from next-token prediction without ever being explicitly specified in the loss function.

At each stage, domain experts, not just outsiders, were caught off guard.

If we had understood what was happening, we would have predicted it.

We did not.