
Education Bookcast

86. Learning as information compression

27 Apr 2020

Description

The inspiration for this episode is a rather technical tome entitled Information Theory, Inference, and Learning Algorithms by David MacKay. It's basically an information theory / machine learning textbook. I initially got it because it's known to be a rewarding work for the nerdiest people in the machine learning (a.k.a. "artificial intelligence") world, who want to get down to fundamentals and understand how concepts from the apparently separate fields of information theory and inference interrelate. I haven't finished the book, and as of this writing I'm not actively reading it. I still wanted to talk about something from it on the podcast, though. In the early chapters of the book, MacKay mentions how learning is, in a way, a kind of information compression. This fascinating idea has been circling in my head for months, so I wanted to comment on it a bit on this podcast. Enjoy the episode.
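The learning-as-compression idea can be made concrete with a standard information-theory fact (the kind MacKay's book covers): under an ideal code, a symbol with probability p costs -log2(p) bits, so a model that has "learned" the statistics of the data can encode it in fewer total bits than a model that knows nothing. Here is a minimal Python sketch of that comparison; the data, symbols, and function names are illustrative, not from the episode:

```python
import math

def codelength_bits(data, probs):
    """Total bits to encode `data` under an ideal code for the model `probs`
    (Shannon codelength: -log2(p) bits per symbol)."""
    return sum(-math.log2(probs[symbol]) for symbol in data)

# A skewed source: 'a' appears far more often than 'b'.
data = "a" * 90 + "b" * 10

# Naive model: has learned nothing, so it assumes both symbols are equally likely.
uniform = {"a": 0.5, "b": 0.5}

# "Learned" model: estimates symbol frequencies from the data itself.
counts = {s: data.count(s) for s in set(data)}
learned = {s: c / len(data) for s, c in counts.items()}

print(codelength_bits(data, uniform))   # 100.0 bits (1 bit per symbol)
print(codelength_bits(data, learned))   # ~46.9 bits
```

The better the model's predictions match the data, the shorter the codelength, which is one precise sense in which learning and compression are the same activity.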


