Lennart Heim
Absolutely.
Yeah, I'm glad you brought it up, because that's actually how this got started for me.
I think it was 2018 when OpenAI put out the blog post "AI and Compute."
And they're like, look, we looked at how much computation it took to train these systems, and it's doubling every 3.4 months.
I was like, wow, Moore's law is described as the fastest exponential ever, and that's doubling every two years.
So doubling every 3.4 months is quite, quite staggering.
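To put those two doubling times side by side, here is a quick back-of-the-envelope sketch; the figures are just the compounding arithmetic implied by the doubling times, not numbers taken from the OpenAI post.

```python
# Annual growth factor implied by a given doubling time:
# growth_per_year = 2 ** (12 / doubling_time_in_months)
doubling_times_months = {
    "Moore's law (~2 years)": 24,
    "AI training compute (3.4 months)": 3.4,
}

for label, months in doubling_times_months.items():
    growth_per_year = 2 ** (12 / months)
    print(f"{label}: ~{growth_per_year:.1f}x per year")

# Moore's law (~2 years): ~1.4x per year
# AI training compute (3.4 months): ~11.5x per year
```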
And what I then did, which is basically also the origin story of Epoch, was find some colleagues and redo the study.
I was like, well, actually, let's add some more systems, because the original "AI and Compute" post only had something like eight or ten AI systems in it, and we expanded that to a couple of hundred.
And what we then found was, okay, it's not doubling every 3.4 months anymore.
It's doubling every six months.
That's still crazy, right?
Crossing like multiple orders of magnitude over the last few years.
Again, going from two GPUs to 200,000, or going from spending a couple of hundred bucks on compute to literally millions, and potentially, in the future, billions.
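As a rough illustration of what a six-month doubling time adds up to (my own arithmetic, not figures from the conversation):

```python
import math

# Growth factor after a given number of years at a 6-month doubling time.
doubling_time_years = 0.5

for years in (2, 5, 10):
    growth = 2 ** (years / doubling_time_years)
    print(f"{years} years: ~{growth:,.0f}x "
          f"(~{math.log10(growth):.1f} orders of magnitude)")

# 2 years: ~16x (~1.2 orders of magnitude)
# 5 years: ~1,024x (~3.0 orders of magnitude)
# 10 years: ~1,048,576x (~6.0 orders of magnitude)
```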
And why are people doing this?
Well, because the systems get better.
It's not like we want to burn money, right?
It's just that we train bigger AI systems on more data.
And if you do this, you just require more computing power.
And this is just what's been called scaling laws.
And this has been, I think, the easiest way to describe AI progress over the last decade.
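For readers unfamiliar with the term, scaling laws are usually expressed as power laws relating training compute (or parameters and data) to model loss. A minimal sketch of that functional form, with constants invented purely for illustration:

```python
# Illustrative compute scaling law: loss falls as a power law in training
# compute, loss(C) = a * C ** (-alpha). The constants a and alpha below are
# made up for this example; real scaling-law papers fit them empirically.
def loss(compute_flop, a=20.0, alpha=0.05):
    return a * compute_flop ** (-alpha)

for c in (1e18, 1e20, 1e22, 1e24):
    print(f"{c:.0e} FLOP -> loss ~ {loss(c):.2f}")

# 1e+18 FLOP -> loss ~ 2.52
# 1e+20 FLOP -> loss ~ 2.00
# 1e+22 FLOP -> loss ~ 1.59
# 1e+24 FLOP -> loss ~ 1.26
```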