
Eliezer Yudkowsky

Speaker
Podcast Appearances

Also, we could not store the model on all the memory on all the computers in the world as of 2008.

As the saying goes, the map is not the territory, but you can't fold up the territory and put it in your glove compartment.

Sometimes you need a smaller map to fit in a more cramped glove compartment, but this does not change the territory.

The scale of a map is not a fact about the territory, it's a fact about the map.

If it were possible to build and run a chromodynamic model of the 747, it would yield accurate predictions.

Better predictions than the aerodynamic model, in fact.

To build a fully accurate model of the 747, it is not necessary, in principle, for the model to contain explicit descriptions of things like airflow and lift.

There does not have to be a single token, a single bit of RAM, that corresponds to the position of the wings.

It is possible, in principle, to build an accurate model of the 747 that makes no mention of anything except elementary particle fields and fundamental forces.

Are you telling me the 747 doesn't really have wings?

It's not just the notion that an object can have different descriptions at different levels.

It's the notion that having different descriptions at different levels is itself something you say that belongs in the realm of talking about maps, not the realm of talking about territory.

It's not that the airplane itself, the laws of physics themselves, use different descriptions at different levels, as yonder artillery gunner thought.

Rather, we, for our convenience, use different simplified models at different levels.

If you looked at the ultimate chromodynamic model, the one that contained only elementary particle fields and fundamental forces, that model would contain all the facts about airflow and lift and wing positions, but these facts would be implicit rather than explicit.

If you wanted to know the position of the wings, you would have to compute it from that low-level description. Having figured it out, you would then have an explicit representation of the wing position in your mind, an explicit computational object there in your neural RAM.
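The implicit/explicit distinction above can be sketched in code. This is only a toy illustration of the general idea (it is not from the transcript, and it is obviously not a chromodynamic model): a "low-level" model stores nothing but particle positions, and a high-level fact such as a center of mass exists in the model only implicitly, until a computation makes it an explicit object.

```python
# Toy low-level model: just particle (x, y) positions.
# No variable anywhere in this data "is" the center of mass;
# that high-level fact is implicit in the particle positions.
particles = [(-2.0, 1.0), (-1.0, 3.0), (1.5, 0.5), (2.5, 2.0)]

def center_of_mass(points):
    """Derive a high-level fact from the low-level state on demand."""
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)

# Only after the computation does an explicit representation exist:
com = center_of_mass(particles)
print(com)  # (0.25, 1.625)
```

The high-level description ("the center of mass is at (0.25, 1.625)") and the low-level description (the particle list) describe the same system; the difference in levels is a fact about which map we computed, not about the territory.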