
Nathaniel Whittemore

👤 Speaker
14492 total appearances


Podcast Appearances

Now, his experiments were in a very different context, but it's not hard to apply this to the conversations about UBI.

I think one can argue that the constant discourse about UBI, the fact that we're effectively trading the job through which you find meaning for a government handout, doesn't counteract the anticipated decline that motivates political violence.

It's basically AI saying, we agree your labor has no future value.

That's not addressing the psychology of loss at all.

You're telling someone that their economic future is over and handing them a stipend.

That's basically the exact opposite of what research has found actually works.

There's also a dignity dimension that the moral typecasting research illuminates.

When AI leaders propose UBI, they're positioning themselves as the moral agents, the powerful actors making decisions, and the public as moral patients, i.e., the passive recipients of those decisions.

Research shows that this is exactly the type of framing that generates resentment, because being typecast as a moral patient means being seen as lacking agency.

UBI from the people who are automating your job is the most condensed possible version of that dynamic.

They aren't upset with Palantir because Peter Thiel can afford to eat a thousand burgers to their one.

The whole thing is in large part post-material.

It's the hierarchy and subordination they're uncomfortable with.

They feel their dignity is being trampled and their autonomy progressively diminished.

Rightly or wrongly, they feel politically disenfranchised and stripped of a say over the future.