Mark Zuckerberg
But these are just the consumer use cases.
I also run our foundation, the Chan Zuckerberg Initiative, with my wife, and we're doing a bunch of work on science there.
There's obviously a lot of AI work that I think is going to advance science and healthcare and all of those things too.
So I think this ends up affecting basically every area of the products and the economy.
I don't know that we know the answer to that.
So I think one thing that seems to be a pattern is that you have the Llama model, and then you build some kind of other application-specific code around it.
So some of it is the fine-tuning for the use case, but some of it is just logic for how Meta AI should work with tools like Google or Bing to bring in real-time knowledge.
I mean, that's not part of the base Llama model; that's part of the logic built around it.
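The wrapper pattern described here can be sketched in a few lines: hand-engineered routing logic decides whether a query needs real-time knowledge, calls a search tool if so, and feeds the results to the base model. This is a minimal illustrative sketch; the function names (`base_model`, `search`) and the keyword heuristic are hypothetical stand-ins, not Meta's actual implementation or any real API.

```python
# Hypothetical sketch of "hand-engineered" logic around a base model:
# route queries that need real-time knowledge through a search tool,
# then stuff the results into the prompt. All names here are
# illustrative stand-ins, not real Meta AI / Llama APIs.

REALTIME_KEYWORDS = ("today", "latest", "current", "news", "score")

def needs_realtime(query: str) -> bool:
    """Brittle keyword heuristic -- exactly the kind of hand-coded rule
    that a later model version might learn to handle itself."""
    q = query.lower()
    return any(keyword in q for keyword in REALTIME_KEYWORDS)

def search(query: str) -> str:
    """Stand-in for a real-time search backend (e.g. Google or Bing)."""
    return f"[search results for: {query}]"

def base_model(prompt: str) -> str:
    """Stand-in for a completion call to the base Llama model."""
    return f"[model answer for prompt: {prompt!r}]"

def answer(query: str) -> str:
    # The hand-engineered part: decide on tool use outside the model.
    if needs_realtime(query):
        context = search(query)
        prompt = (
            "Use this context to answer.\n"
            f"Context: {context}\n"
            f"Question: {query}"
        )
    else:
        prompt = query
    return base_model(prompt)
```

The brittleness discussed later in the conversation is visible here: the keyword list only unlocks the use cases it was written for, which is why the stated goal is to train this routing behavior into the model itself over successive versions.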
Okay, so for Llama 2, we had some of that, and it was a little more hand-engineered.
Then part of our goal for Llama 3 was to bring more of that into the model itself.
But for Llama 3, as we start getting into more of these agent-like behaviors, I think some of that is going to be more hand-engineered.
And then I think our goal for Llama 4 will be to bring more of that into the model.
So I think at each step along the way, you have a sense of what's going to be possible on the horizon.
You start messing with it and hacking around it.
And then I think that that helps you hone your intuition for what you want to try to train into the next version of the model itself.
Interesting.
Which makes it more general, because anything you're hand-coding can unlock some use cases, but it's inherently brittle and non-general.