Eve Bodnia
You don't really play a guessing game anymore.
You see your energy landscape, you know where the right answer is, and you just know where to go right away.
And this is what saves you time.
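The idea of descending an energy landscape toward the answer, rather than guessing, can be sketched as plain gradient descent on a toy energy function (the quadratic energy, step size, and targets here are illustrative assumptions, not the speaker's actual model):

```python
import numpy as np

def energy(x, target):
    # Toy energy: lowest where x matches the "right answer" (target).
    return 0.5 * np.sum((x - target) ** 2)

def grad_energy(x, target):
    # Gradient of the quadratic energy above.
    return x - target

def descend(x, target, lr=0.1, steps=100):
    # Inference as energy minimization: follow the slope downhill
    # instead of sampling guesses and checking them.
    for _ in range(steps):
        x = x - lr * grad_energy(x, target)
    return x

start = np.array([5.0, -3.0])
answer = np.array([1.0, 2.0])
result = descend(start, answer)
```

Because the landscape tells you which direction lowers the energy at every point, the model always "knows where to go" from wherever it currently is.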
So our models are very small.
We scale this model from 20 million parameters to 200 million parameters, and there is a range in between.
And we run it on the cheapest H100 GPU.
That's impressive.
Yeah, I was very much inspired by your brain, right?
As you're speaking to me right now, if somebody says, hey, can I create a digital twin of you?
And I'm trying to map your internal body state, the information you process as you're listening to me, your visuals.
It's going to take an enormous amount of GPUs and energy and orchestration, but your brain can naturally do it with less than 20 watts.
So if you make the architecture right, well, full disclosure, I'm not comparing people with machines, because we evolved over many years.
So there's some very hardcore evolution behind it, but the idea here is that if you make the architecture right, it's not supposed to take that many GPUs.
Yeah, so it can be quite a deep discussion, because there are so many different versions of diffusion models.
And to me, to have a technical discussion, I always want to define things first so we don't get ambiguous, right?
But there's some similarities with diffusion models, obviously.
And the ideas of energy-based models in general, they're not that new.
They've been out there for like 20 years, and some of the early science goes back even 40 years.
The problem was that nobody had tried to build an energy-based reasoning model.
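In that spirit, an energy-based reasoning model assigns a scalar energy E(question, answer) and selects the answer that minimizes it. A minimal sketch, with a hypothetical word-overlap energy standing in for a learned neural network:

```python
def energy(question: str, answer: str) -> float:
    # Illustrative energy: word overlap turned into a penalty,
    # so better-matching answers get lower energy.
    # A real energy-based model would learn E with a network.
    q = set(question.lower().split())
    a = set(answer.lower().split())
    return -float(len(q & a))

def reason(question: str, candidates: list[str]) -> str:
    # Reasoning as energy minimization over candidate answers,
    # rather than generating one token at a time.
    return min(candidates, key=lambda ans: energy(question, ans))

question = "what color is the sky"
candidates = ["the sky is blue", "grass is green", "two plus two"]
best = reason(question, candidates)  # -> "the sky is blue"
```

The point of the sketch is only the shape of the computation: one scalar energy per (question, answer) pair, and inference as finding the energy minimum.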