Joe Liemandt
And they started writing papers 40 years ago that were basically the teacher in front of the classroom is the worst way to teach kids.
And there's all these ways where kids can learn two, five, ten times faster.
But it's not scalable, so you can't do anything about it.
One of the first papers, Bloom's Two Sigma, is a seminal paper, and they say in the paper, well, if everybody had an individualized tutor, it'd be great.
Obviously not feasible.
There's a lot of other things too, mastery-based learning, spaced repetition, cognitive load theory, all these other concepts.
We can go into those, but fundamentally, it just hasn't been possible to do it.
And that's sort of, I guess, how I came into the Alpha story. The short version of my background is I was an old-school AI guy, not the new neural net stuff.
In the 80s, I actually wrote a paper in high school on AI.
And one of my paragraphs was like, neural nets, but they're decades away.
Back then it was all expert systems and ontologies.
Then I went to Stanford and I was actually in a class with Ed Feigenbaum, Professor Feigenbaum, who was the father of expert systems.
And he's talking about how these things are worth millions of dollars if you ever get them to work.
And my classmates and I, a couple of us, we dropped out and started Trilogy.
And our claim to fame in the 90s was we were the first AI product to sell a billion dollars.
But old school, old school stuff.
I was doing Trilogy, a private software company.
But then three years ago, when gen AI came out, I was like, oh, neural nets are here. The "decades away" is here.
And my kids were at Alpha because 10 years ago, I didn't talk about it.