Chris Lattner
I know your roots, right?
And this is a powerful thing, right?
And so if you go back to Lisp, one of the most powerful things about it is that it made metaprogramming and programming the same thing.
And so that made it way simpler, way more consistent, way easier to understand and reason about, and it made it more composable.
So if you build a library, you can use it both at runtime and compile time, which is pretty cool.
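To make that point concrete, here is a minimal sketch of the idea in Python rather than Lisp (no code appears in the conversation itself; the snake_case helper and add_accessors decorator are invented for illustration). The same ordinary library function is used once while the program is being built, inside a decorator that runs at class-definition time, which is a rough analogue of macro expansion, and again at runtime.

```python
# Rough Python analogue of "metaprogramming and programming are the same":
# one helper function serves both the metaprogram and the running program.

def snake_case(name: str) -> str:
    """Ordinary library function: lowercase words joined by underscores."""
    return "_".join(name.lower().split())

def add_accessors(*field_names):
    """Decorator that runs at class-definition time ("compile time" here)
    and uses snake_case to generate getter methods."""
    def decorate(cls):
        for field in field_names:
            attr = snake_case(field)
            setattr(cls, "get_" + attr, lambda self, a=attr: getattr(self, a))
        return cls
    return decorate

@add_accessors("First Name", "Last Name")
class Person:
    def __init__(self, first_name, last_name):
        self.first_name = first_name
        self.last_name = last_name

# Runtime: the very same snake_case function is also usable in ordinary code.
p = Person("Ada", "Lovelace")
print(p.get_first_name())       # -> Ada
print(snake_case("Last Name"))  # -> last_name
```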
Yeah.
Okay, so let's come back to that.
So what is machine learning?
Or what is a machine learning model?
You take a PyTorch model off the internet, right?
It's really interesting to me, because what PyTorch and TensorFlow and all these frameworks are pushing compute into is an abstract specification of a compute problem,
which then gets mapped in a whole bunch of different ways.
And so this is why it became a metaprogramming problem.
You want to be able to say, cool, I have this neural net.
Now run it with batch size 1,000.
Do a mapping across batch.
Or, okay, I want to take this problem, now run it across 1,000 CPUs or GPUs.
And so this problem of describing the compute and then mapping it and transforming it is actually very profound, and that's one of the things that makes machine learning systems really special.
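As a small, simplified illustration of that "describe the compute once, then map it" idea, here is a Python sketch using PyTorch's torch.func.vmap; the predict function, the weight shapes, and the batch of 1,000 examples are made up for the example, not taken from the conversation.

```python
# Describe the computation for a single example, then map the same
# description across a batch without rewriting it.
import torch
from torch.func import vmap

def predict(weights, x):
    """The abstract computation, written for one example."""
    return torch.tanh(x @ weights)

weights = torch.randn(8, 4)
single_x = torch.randn(8)
batch_x = torch.randn(1000, 8)   # "now run it with batch size 1,000"

# Same description of the compute, two different mappings:
y_single = predict(weights, single_x)                          # one example
y_batch = vmap(predict, in_dims=(None, 0))(weights, batch_x)   # mapped across the batch

print(y_single.shape)  # torch.Size([4])
print(y_batch.shape)   # torch.Size([1000, 4])
```

The same abstract specification can also be mapped across hardware, for instance by wrapping a model in torch.nn.parallel.DistributedDataParallel to spread work over many CPUs or GPUs, which is the other mapping direction mentioned above.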
Yeah, well, so what is auto-tuning?
So take a step back.