Ed Elson
Because AI is going to completely replace us, which is going to eviscerate incomes and eviscerate wages.
And meanwhile, as that is happening, consumption of AI products is going to keep growing and growing and growing.
And this is the part that doesn't really make sense: how is it that you're going to have people who don't have enough money to pay for anything or to consume anything, and yet consumption continues to go up?
And this is the part where he's very descriptive about the value destruction that we might see, but then completely ignores the value creation and what we might do with all of that productivity.
And I think this is the thing that a lot of people are taking issue with.
It's something that I take issue with as well.
I think that there's not enough analysis of what's going to happen on the other side of those accounts.
I mean, if you've got consumption going up, then that necessarily assumes that people have money to pay for things.
But he doesn't really acknowledge that side of the equation.
He only focuses on the downside, which, when you read it, is kind of interesting.
But when you start to logically think it through, it doesn't really make much sense.
I think this is a really important point.
You mentioned you're pulling back on a certain type of legal service, and you're spending a lot less money on that because you've got this AI tool that is helpful and you've hired someone who's going to consolidate that work.
But then you're spending more on the corporate restructuring over here.
And that is a dynamic that I think a lot of people are not really recognizing, which is, sure, some money might move out of this ecosystem, but then where is it going to go?
That's the question that people aren't really asking enough and that the Citrini Research article actually refuses to acknowledge at all.
They spend a lot of time saying, here's where the money is going to move away from: it's going to pull out of here and here and here.