That's right. You could have a model that's growing, technically.
But right now in transformers, the reasoning power doesn't come so much from size per se.
OK, when you think about this: we actually do have a lot of compute power because of how much space we have in the brain, because of its structure. So these models... but if you look at the brain now, I would need to check again, but the synaptic connections in the brain are in the trillions, right? And I think it could be...
Yeah, like 1,000.
Some folks would say it could be, I think, 1,000 trillion synaptic connections in the brain, something like that.
This gives you a lot of memory and a very efficient structure.
So effectively, you get to something that operationally works like infinite context.
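To put those numbers in perspective, here is a rough back-of-envelope sketch. The synapse counts are common neuroscience estimates matching the figures mentioned above; the 1-trillion-parameter transformer is an assumed figure for illustration, not a specific model:

```python
# Back-of-envelope scale comparison. The synapse counts are common
# neuroscience estimates (~100 to ~1,000 trillion connections); the
# 1-trillion-parameter transformer is an assumption for illustration.

synapses_low = 100e12      # ~100 trillion synaptic connections
synapses_high = 1000e12    # ~1,000 trillion, the upper figure mentioned above
llm_params = 1e12          # hypothetical 1-trillion-parameter transformer

print(f"low estimate:  {synapses_low / llm_params:,.0f}x the parameter count")
print(f"high estimate: {synapses_high / llm_params:,.0f}x the parameter count")
```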
I mean, to be super scientifically precise, yes, context in BDH is limited by the size of your brain, so the number of neurons and the connections between them.
But this network structure allows you to encode so much.
Exactly.
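The idea that the network itself is the memory can be sketched in a few lines. This is an illustrative toy, assuming a Hebbian-style outer-product write into a fixed-size state matrix; the names, sizes, and update rule here are assumptions for the sketch, not the actual BDH equations:

```python
import numpy as np

# Toy sketch of "context stored in the connections": a fixed n x n synaptic
# matrix S acts as working memory, written with Hebbian-style outer products
# and read associatively. Illustrative assumptions only, not BDH's equations.

n = 8                     # number of "neurons" (toy size)
S = np.zeros((n, n))      # synaptic state: all the memory there is

def write(key, value):
    """Hebbian-style write: strengthen connections between co-active neurons."""
    global S
    S += np.outer(value, key)

def read(query):
    """Associative read: recover whatever was bound to the query pattern."""
    return S @ query

# Bind a few key/value pairs, using orthogonal one-hot keys for a clean demo.
keys = np.eye(n)
values = np.random.default_rng(0).standard_normal((n, n))
for k, v in zip(keys, values):
    write(k, v)

# Recall is exact for orthogonal keys, and S never grew: the state stays
# O(n^2) no matter how many tokens have been processed.
assert np.allclose(read(keys[3]), values[3])
print("state shape is fixed at", S.shape)
```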
And then when you think about this, you keep your memory close to the core.
Actually, exactly at the core.
So, for the technical people: you don't need to do lookups.
You don't spend your energy on all of that.
You don't need additional compute for it.
So it becomes very efficient from this point of view.
You have memory directly there.
And yeah, it's like in-memory computing on a chip.
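One way to see the efficiency argument is to compare memory footprints: a transformer's KV cache grows with context length and has to be looked up at every step, while a fixed neuron-to-neuron state does not grow at all. All dimensions below are assumptions for illustration:

```python
# Rough memory-footprint comparison (all dimensions are assumed figures).
# A transformer keeps, and looks up into, a KV cache that grows with context;
# a fixed neuron-to-neuron state stays the same size at any context length.

bytes_per_value = 2            # fp16
d_model, n_layers = 4096, 32   # assumed transformer dimensions
n_neurons = 100_000            # assumed neuron count for the fixed-state model

def kv_cache_bytes(context_len: int) -> int:
    # keys + values, per layer, per token
    return 2 * n_layers * context_len * d_model * bytes_per_value

def fixed_state_bytes() -> int:
    # one n x n synaptic matrix, independent of context length
    return n_neurons * n_neurons * bytes_per_value

for t in (1_000, 100_000, 10_000_000):
    print(f"context {t:>10,}: KV cache {kv_cache_bytes(t)/1e9:9.1f} GB, "
          f"fixed state {fixed_state_bytes()/1e9:6.1f} GB")
```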