Chris Lattner

Speaker · 2524 total appearances

Podcast Appearances

Lex Fridman Podcast
#381 โ€“ Chris Lattner: Future of Programming and AI

And what that does is it keeps out the little guys. And sometimes they're not-so-little guys: the big guys are also just not in those dominant positions.

And so what has been happening, and this is a lot of the talk about the rise of new exotic, crazy accelerators, is that people have been trying to turn this from a "let's go write lots of special kernels" problem into a compiler problem. And so we as an industry, and I contributed to this as well, went into what you could call the "let's make this a compiler problem" phase. And much of the industry is still in this phase, by the way; I wouldn't say this phase is over.

And so the idea is to say: look, what a compiler does is provide a much more general, extensible, hackable interface for dealing with the general case.

And so within machine learning algorithms, for example, people figured out that, hey, if I do a matrix multiplication and then I do a ReLU, the classic activation function, it is way faster to do one pass over the data and apply the ReLU to the output as I'm writing it out, because ReLU is just a maximum operation: max with zero. And so it's an amazing optimization: take matmul and ReLU, squish them together into one operation, and now we have MatmulReLU.
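As an illustration (an editor's sketch, not code from the episode), here is what that fusion looks like when you hand it to a compiler instead of writing a special kernel by hand. This uses JAX and its XLA compiler as a stand-in for the "compiler problem" framing: written eagerly, the matmul and the ReLU are two separate passes over the data, and under jax.jit the compiler is free to fuse the elementwise maximum into the matmul's output write.

```python
import jax
import jax.numpy as jnp

def matmul_relu(a, b):
    # ReLU is just a maximum operation: max(x, 0), applied elementwise.
    # Written eagerly, this is two passes over the data: one for the
    # matmul, one for the maximum over its output.
    return jnp.maximum(a @ b, 0.0)

# Under jit, XLA may fuse the elementwise max into the matmul's output
# write, producing the result in one pass with no hand-written
# MatmulReLU kernel.
fused = jax.jit(matmul_relu)

a = jnp.ones((128, 64))
b = jnp.ones((64, 32))
print(fused(a, b).shape)  # (128, 32)
```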

Well, wait a second. If I do that now, I just went from having two operators to three.
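A toy sketch (again the editor's illustration, with a hypothetical kernel registry) of why the fused operator is added alongside the primitives rather than replacing them, and how fusing pairs of operators makes the kernel set grow combinatorially:

```python
from itertools import permutations

# Hypothetical kernel registry: two hand-written primitives.
ops = {"matmul", "relu"}

# Fusing matmul+relu adds a third kernel; the originals are still
# needed on their own, so two operators become three.
ops.add("matmul_relu")
print(sorted(ops))  # ['matmul', 'matmul_relu', 'relu']

# Fusing every ordered pair of N primitives adds up to N*(N-1) more
# kernels: the combinatorial growth that motivates treating this as
# a compiler problem instead.
primitives = ["matmul", "relu", "add", "tanh"]
fused_pairs = [f"{a}_{b}" for a, b in permutations(primitives, 2)]
print(len(primitives) + len(fused_pairs))  # 4 + 12 = 16
```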