Zack Kass
Every child grows up knowing that Benjamin Franklin discovered electricity and how he did it, right?
The story of the kite and the key is almost lore.
And the reason it's lore is that electricity is so important to us.
And AI, I think the history of AI is important, so I'm glad you call it out.
What's important to know about AI is that most of the AI that people have used for the better part of the last 30 years is what we call statistical machine learning.
And statistical machine learning is very simply a massive series of if this, then that rules.
So imagine I were to say to you, Ilana, okay, we're gonna build this statistical machine learning model that is capable of doing this one big complex thing.
What you actually have to do in order to build that model is you have to build a bunch of left turn, right turn rules inside of it.
If Ilana does this, then do this, and what about this, then this.
And it's pretty good at producing, for example, product recommendations.
If someone is this tall or looks like this or has this many friends or shops this often, they're more likely to buy this thing.
That's what statistical machine learning is.
It's not actually reasoning.
It's just big sets of rules.
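The "if this, then that" style described above can be sketched in a few lines. This is my own toy illustration, not anything from the transcript: a hand-written cascade of rules for product recommendations, with made-up customer attributes (`friends`, `height_cm`, `orders_per_month`) standing in for the features the speaker lists.

```python
# Toy sketch of a rule-based recommender: every branch is a hand-built
# "left turn / right turn" rule, not learned reasoning. All attribute
# names and thresholds here are invented for illustration.

def recommend(customer: dict) -> str:
    """Return a product suggestion from a fixed cascade of if/then rules."""
    if customer.get("friends", 0) > 100:
        return "party speaker"
    if customer.get("height_cm", 0) > 190:
        return "tall-fit jeans"
    if customer.get("orders_per_month", 0) >= 4:
        return "loyalty subscription"
    return "gift card"  # fallback when no rule fires

print(recommend({"friends": 250}))         # party speaker
print(recommend({"orders_per_month": 5}))  # loyalty subscription
```

The point of the sketch is that every behavior must be anticipated and written down by a person; the model never generalizes beyond its rules.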
And that was the prevailing machine learning practice for about 40 years.
And we got better at it because we were building bigger data sets and the compute was getting more efficient, but it still had this ceiling.
And in 2017, so I was at Lilt at the time, eight Google researchers wrote a paper, "Attention Is All You Need."
And in this paper, the authors, we called them the Transformer Eight, argued that we were building machine learning models all wrong.
And that instead of building models that think in straight lines, statistical machine learning, we should build models that think in parallel, neural networks.
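The core mechanism that paper proposed, scaled dot-product attention, can be sketched minimally. This is my own illustration, not code from the paper: each query position scores its similarity to every key at once and mixes all the values by those scores, instead of stepping through a sequence of hand-written rules.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention over lists of equal-length vectors."""
    d = len(keys[0])
    out = []
    for q in queries:
        # Score this query against EVERY key at once, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        # Output is a weighted mix of all value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

q = [[1.0, 0.0]]                      # one query vector
k = [[1.0, 0.0], [0.0, 1.0]]          # two keys
v = [[10.0, 0.0], [0.0, 10.0]]        # two values
print(attention(q, k, v))             # mix weighted toward the first value
```

The contrast with the rule cascade is that nothing here is hand-authored per case: the same formula lets every position weigh every other position, which is what "thinking in parallel" gestures at.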