Chris Lattner
But what we're doing, when you talk about types, for example, is you can say: look, you can start with the world you already know, and you can progressively learn new things and adopt them where it makes sense.
If you never do that, that's cool.
You're not a bad person.
If you get really excited about it and want to go all the way into the deep end and rewrite everything, that's cool too.
Right.
But I think the middle path is actually the more likely one, where you come across a new idea and you discover, wow, that makes my code way simpler, way more beautiful, way faster, whatever.
And I think that's what people like.
Now, if you fast forward, say, 10 years out, I can give you a very different answer on that.
If you go back and look at what computers looked like 20 years ago,
every 18 months they got faster for free, right?
2x faster every 18 months.
It was like clockwork.
It was free, right?
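For context on how quickly that "2x every 18 months" compounds, here is a minimal sketch, not anything from the conversation itself: the `speedup` helper and the chosen time horizons are illustrative assumptions, with the 18-month doubling period as the only input taken from the transcript.

```python
def speedup(years: float, doubling_period_years: float = 1.5) -> float:
    """Cumulative speedup after `years` of doubling every
    `doubling_period_years` (18 months = 1.5 years)."""
    return 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    # Hypothetical horizons, just to show the compounding.
    for years in (1.5, 3, 10, 20):
        print(f"after {years:>4} years: ~{speedup(years):,.0f}x faster")
    # after  1.5 years: ~2x faster
    # after    3 years: ~4x faster
    # after   10 years: ~102x faster
    # after   20 years: ~10,321x faster
```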
You go back 10 years, and we entered this world where suddenly we had multi-core CPUs and we had GPUs.
And if you squint and turn your head, what a GPU is, is just a many-core, very simple CPU kind of thing, right?
And so 10 years ago it was CPUs and GPUs for graphics.
Today, we have CPUs, GPUs for graphics, and AI.
Because it's so important, because the compute is so demanding, because of the smart cameras and the watches and all the different places AI needs to work in our lives, it has caused this explosion of hardware.
And so part of my thesis, part of my belief about where computing goes if you look out 10 years from now, is that it's not going to get simpler.