Trenton Bricken
Podcast Appearances
Yeah, you just have more genetic recombination and shots on target.
Yeah.
Yeah, I think physics and math might be slightly different in this regard.
But especially for biology or any sort of wetware, and to the extent we want to analogize neural networks here, it's comical how serendipitous a lot of the discoveries are.
Like penicillin, for example.
This is the Carl Shulman sort of argument of like, we're going to race through the orders of magnitude in the near term, but then longer term it would be harder.
Exactly.
So GOFAI is good old-fashioned AI, right?
And can you define that?
Because when I hear it, I think of if-else statements for symbolic logic.
Not only that, but if you believe claims that GPT-4 is around a 1 trillion parameter count, I mean, the human brain has between 30 and 300 trillion synapses.
And so that's obviously not a one-to-one mapping and we can debate the numbers, but...
it seems pretty plausible that we're still below brain scale.
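As a rough back-of-the-envelope sketch of that comparison (treating one parameter as loosely analogous to one synapse, which is a big simplification, and using only the figures quoted above):

```python
# Back-of-the-envelope comparison of the numbers mentioned above.
gpt4_params = 1e12                 # ~1 trillion parameters (claimed)
brain_synapses = (30e12, 300e12)   # 30-300 trillion synapses

for synapses in brain_synapses:
    print(f"brain / GPT-4 ratio: {synapses / gpt4_params:.0f}x")
# -> roughly 30x to 300x, i.e. plausibly still below "brain scale"
```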
Yeah, but the sample efficiency stuff, I never know exactly how to think about it because obviously a lot of things are hardwired in certain ways, right?
And there's the co-evolution of language and the brain structure.
So it's hard to say.
Also, there are some results that if you make your model bigger, it becomes more sample efficient.
Larger models almost get that sample efficiency along the way.
So maybe that also just solves it.
Like, you don't have to explicitly be more data efficient; if your model's bigger, then you also just are more data efficient.
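A toy illustration of that point, using a Kaplan-style power-law loss in parameters and tokens (the constants here are made up for illustration, not published fits): at the same fixed data budget, the larger model reaches a lower loss, i.e. it extracts more from the same samples.

```python
# Toy scaling-law sketch: loss as a power law in parameters N and tokens D.
# All constants are illustrative only, not fitted values.
def loss(n_params, n_tokens, a=0.07, b=0.10, nc=1e14, dc=1e13):
    return (nc / n_params) ** a + (dc / n_tokens) ** b

tokens = 1e11                # identical data budget for both models
small = loss(1e9, tokens)    # hypothetical 1B-parameter model
large = loss(1e12, tokens)   # hypothetical 1T-parameter model
print(f"small model loss: {small:.3f}, large model loss: {large:.3f}")
# The larger model reaches lower loss on the same data,
# i.e. it is more sample efficient.
```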