
AXRP - the AI X-risk Research Podcast

15 - Natural Abstractions with John Wentworth

23 May 2022

Description

Why does anybody care about natural abstractions? Do they somehow relate to math, or value learning? How do E. coli bacteria find sources of sugar? All these questions and more will be answered in this interview with John Wentworth, where we talk about his research plan of understanding agency via natural abstractions.

Topics we discuss, and timestamps:
- 00:00:31 - Agency in E. coli
- 00:04:59 - Agency in financial markets
- 00:08:44 - Inferring agency in real-world systems
- 00:16:11 - Selection theorems
- 00:20:22 - Abstraction and natural abstractions
- 00:32:42 - Information at a distance
- 00:39:20 - Why the natural abstraction hypothesis matters
- 00:44:48 - Unnatural abstractions used by humans?
- 00:49:11 - Probability, determinism, and abstraction
- 00:52:58 - Whence probabilities in deterministic universes?
- 01:02:37 - Abstraction and maximum entropy distributions
- 01:07:39 - Natural abstractions and impact
- 01:08:50 - Learning human values
- 01:20:47 - The shape of the research landscape
- 01:34:59 - Following John's work

The transcript: axrp.net/episode/2022/05/23/episode-15-natural-abstractions-john-wentworth.html

John on LessWrong: lesswrong.com/users/johnswentworth

Research that we discuss:
- Alignment by default (contains the natural abstraction hypothesis): alignmentforum.org/posts/Nwgdq6kHke5LY692J/alignment-by-default#Unsupervised__Natural_Abstractions
- The telephone theorem: alignmentforum.org/posts/jJf4FrfiQdDGg7uco/information-at-a-distance-is-mediated-by-deterministic
- Generalizing Koopman-Pitman-Darmois: alignmentforum.org/posts/tGCyRQigGoqA4oSRo/generalizing-koopman-pitman-darmois
- The plan: alignmentforum.org/posts/3L46WGauGpr7nYubu/the-plan
- Understanding deep learning requires rethinking generalization (deep learning can fit random data): arxiv.org/abs/1611.03530
- A closer look at memorization in deep networks (deep learning learns before memorizing): arxiv.org/abs/1706.05394
- Zero-shot coordination: arxiv.org/abs/2003.02979
- A new formalism, method, and open issues for zero-shot coordination: arxiv.org/abs/2106.06613
- Conservative agency via attainable utility preservation: arxiv.org/abs/1902.09725
- Corrigibility: intelligence.org/files/Corrigibility.pdf

Errata:
- E. coli has ~4,400 genes, not 30,000.
- A typical adult human body has thousands of moles of water in it, and therefore must consist of well more than 10 moles total.

