Nish Kumar
Is that right?
There's a great line in the book that I think feels partly like a mission statement for it, which is: we need to learn how to control the machine or be controlled by it, right?
And I think a big part of that is
pushing back on the idea that the total takeover of our entire information networks by AI is inevitable.
We hear this all the time.
One of our listeners, Robert, has pointed out, on this idea of AI's inevitability: surely that idea is only beneficial to the CEOs and shareholders of the AI industry.
It inflates the bubble and fills the pockets of AI bros.
The more these statements are repeated, the more our pension funds will be directed to AI companies, and lobbying money will spill between the industry and our governments.
It's exactly the same as oil executives saying it's inevitable that we must keep drilling for oil or weapons manufacturers saying arms will be built and sold no matter what.
I thought Robert articulated something really, really important there, and very brilliantly.
But what do you make of that?
Is there room for one more?
Also, there's a lot to say about that.
But one, my girlfriend would find the idea of me moving to the woods very funny because she's like, where are you going to get your flat whites?
I'm not constitutionally able to live outside of a large city.
But I would also say I've always found this idea that large language models will solve the climate crisis very alarming, given the amount of environmental damage done by the data centres used to power these large language models.
And so when people like Sam Altman are pressed on that, they say, well, if we've burned enough energy, we'll come up with a solution to the problem that we have created.
It's a bit of a mad argument, that, isn't it?