Rob Wiblin
willing to be avant-garde are going to be more intellectually avant-garde, tolerant of quite a lot of philosophical reasoning and speculation. In a sense, I think this might be what a healthy EA community is: an engine that incubates cause areas at a stage when they're not very respected, they're extremely speculative, and the methodology isn't firm yet.
You kind of just have to be extremely altruistic and extremely willing to do unconventional things.
And then it matures those cause areas to the point where they can stand on their own, while also being something that many EAs work on.
And I think digital sentience, and maybe the other things on Will and Tom's list, like space governance and thinking about value lock-in, are other candidates for EA to incubate the way it incubated worrying about AI takeover, basically.
I think there are some versions of the value lock-in concern that go through something else overtly scary and bad happening, like one person getting all of the power, and that's how that person's values get locked in.
But I think there's a whole spectrum of things that are almost like social media plus plus.
It's sort of like, in this distributed way, this technology has made us meaner to each other and worse at thinking, and has allowed individuals to live in information bubbles of their own creation.
You can imagine AIs getting way better at creating a curated information bubble for each individual person, one that allows them to continue believing whatever it is they started out believing, with superintelligent help preventing them from changing their mind.
And this might be something you think of as an important social problem for the long-run future, even if it doesn't happen via one person getting all the power.
Power is still relatively distributed, but large fractions of society are impervious to changing their mind.
Yeah, absolutely.
And I think even the tamest of EA cause areas, like global health and development, has a huge dose of this.
I think if you look at GiveWell's cost-effectiveness analyses, they have to grapple with questions like: how does the value of doubling someone's income, if they make very little money, compare to a certain risk of death, or to the disvalue of a painful disease they could have? And they have to arrive at their answers based on surveys and weird studies people have done. It's not very rigorous in the end.
And they have to form those judgments and spell them out.