Demis Hassabis
And then scaling the current capabilities to the max.
And we're still seeing fantastic progress with each new version of Gemini.
Well, if you look at the history of the last decade or fifteen years, maybe 80 to 90% of the breakthroughs that underpin the modern AI field today came from, originally, Google Brain, Google Research, and DeepMind.
So, yeah, I would back that to continue, hopefully.
I'm not very worried about that, partly because I think there's enough data, and it's been proven that you can get the systems to be pretty good.
And this goes back to simulations again.
Do you have enough data to build simulations so that you can create more data, synthetic data drawn from the right distribution?
Obviously, that's the key.
So you need enough real-world data to be able to build those kinds of data generators. And I think we're at that step at the moment.
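(A minimal sketch of that idea, not anything described in the conversation itself: fit a simple generative model to a pool of real samples, then draw synthetic samples from the learned distribution. The stand-in data and the choice of a Gaussian kernel density estimate as the "data generator" are purely illustrative assumptions.)

```python
# Illustrative sketch: learn a distribution from real-world samples,
# then generate synthetic data from that same learned distribution.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Stand-in for "enough real-world data": samples from an unknown process.
real_data = rng.normal(loc=3.0, scale=1.5, size=10_000)

# Fit a simple data generator (kernel density estimate) to the real data.
generator = gaussian_kde(real_data)

# Draw many more synthetic samples from the learned distribution.
synthetic_data = generator.resample(size=50_000, seed=1)

# Sanity check: synthetic data should roughly match the real moments.
print(real_data.mean(), synthetic_data.mean())
print(real_data.std(), synthetic_data.std())
```

The point of the sketch is the ordering: the generator is only as good as the real data it was fit on, which is why enough real-world data has to come first.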
Exactly.
Yeah.
I think so, for several reasons.
I think there's the amount of compute you have for training.
Often it needs to be co-located.
So actually even bandwidth constraints between data centers can affect that.
So there are additional constraints even there.
And that's important, obviously, for training the largest models you can.
But also, because AI systems are now in products and being used by billions of people around the world, you need a ton of inference compute.