Jeff Dean
Pretty much all of deep learning has taken off roughly because of that, because you can build it out of matrix multiplications that are n cubed operations and n squared bytes of data communication, basically.
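A quick way to see that point: an n-by-n matrix multiply does on the order of 2n³ floating-point operations while moving only on the order of 3n² values (two operands in, one result out), so the FLOPs-per-byte ratio grows linearly with n. A minimal illustrative sketch in plain Python (the byte size and matrix sizes below are just example assumptions):

```python
def matmul_intensity(n: int, bytes_per_elem: int = 4) -> float:
    """Arithmetic intensity (FLOPs per byte) of an n x n matmul.

    An n x n @ n x n matmul does ~2*n**3 FLOPs (one multiply and one
    add per inner-product term) while reading two operands and writing
    one result, ~3*n**2 elements of data movement.
    """
    flops = 2 * n**3
    bytes_moved = 3 * n**2 * bytes_per_elem
    return flops / bytes_moved

for n in (64, 1024, 16384):
    print(f"n={n:>6}: {matmul_intensity(n):,.0f} FLOPs/byte")
```

The bigger the matrices, the more arithmetic you get done per byte you move, which is exactly why matmul-heavy workloads suit hardware where compute is abundant and bandwidth is scarce.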
You have to see the insight. It seems like it's all about identifying opportunity costs.
This is something Larry Page, I think, used to always say: our second-biggest cost is taxes, and our biggest cost is opportunity cost.
And if he didn't say that, then I've been misquoting him for years.
But basically, it's: what is the opportunity that you have that you're missing out on?
And in this case, I guess it was that, okay, you've got all of this chip area, and you're putting a very small number of arithmetic units on it. If instead you fill the thing up with arithmetic units, you could have orders of magnitude more arithmetic getting done.
Now what else has to change?
Okay, the algorithms and the data flow and everything else.
Okay, now arithmetic is extremely cheap and data movement is not cheap.
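That tradeoff is what a roofline model captures: once a chip is packed with arithmetic units, whether a kernel runs at peak speed depends on its FLOPs-per-byte intensity relative to the machine's compute-to-bandwidth ratio. A hedged sketch with hypothetical hardware numbers (the peak FLOP/s and bandwidth below are made up for illustration, not any particular chip):

```python
def attainable_flops(intensity: float,
                     peak_flops: float = 100e12,     # hypothetical 100 TFLOP/s peak
                     bandwidth: float = 1e12) -> float:  # hypothetical 1 TB/s memory
    """Roofline model: achievable FLOP/s at a given arithmetic intensity.

    Below the ridge point (peak_flops / bandwidth, in FLOPs/byte) a
    kernel is limited by data movement; above it, by arithmetic.
    """
    return min(peak_flops, intensity * bandwidth)

ridge = 100e12 / 1e12  # 100 FLOPs/byte on this hypothetical chip
for intensity in (2, 50, 100, 1000):
    bound = "memory-bound" if intensity < ridge else "compute-bound"
    tflops = attainable_flops(intensity) / 1e12
    print(f"{intensity:>5} FLOPs/byte -> {tflops:.0f} TFLOP/s ({bound})")
```

On this made-up machine, a low-intensity kernel gets a small fraction of peak no matter how many arithmetic units exist, while a big matmul sails past the ridge point and uses them all.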
What would AI look like today?
That's interesting.
Yeah, I mean, I think it might look more like AI looked like 20 years ago, but in the opposite direction.
I'm not sure.
I guess I joined Google Brain in 2012.
I'd left Google for a few years and happened to go back for lunch to visit my wife, and we happened to sit down next to Jeff and the early Google Brain team.