Jaden Schaefer
All right, let's get into the first story, which is Google Gemini.
I want to talk about their new open source situation with Gemini 4.
Earlier this week, they released it under the Apache 2.0 license.
And basically, this is their latest family of open models built specifically for reasoning and agentic workflows.
What I think is really interesting about Gemini 4 is what Google is calling, you know, the best intelligence-per-parameter ratio of any open model right now.
Basically, you're getting the frontier-level capabilities of what you'd expect out of something like Claude or ChatGPT without needing a massive hardware setup.
You know, something like Llama 4 Maverick requires that huge hardware setup.
And so you're basically getting around that.
The model already has over 400 million downloads and the community has spun up over 100,000 variants, which I think just kind of tells you how quickly developers are adopting this.
I think the significance is that it's less about the benchmarks themselves and more about the trend, right?
The gap between open source and closed source models is definitely shrinking.
And I think that Gemini 4 is just another data point in that direction.
If we want to get into the licensing on this, the Apache 2.0 license is also really important, because it means that companies can actually use this commercially without worrying about any sort of restrictive terms.
I remember when Llama first came out from Meta, and they were like, look, it's an open source model.
And it's like, well, it's not really open source.
It's just, you know, open weight, and you can use it.
But if you really want to use it for something commercial, you've got to let them know.
And there was all this ambiguity; it was very unclear.
And I think Google just gets right around all of that with Apache 2.0.
For anyone that's building agents or doing reasoning-heavy work on their own infrastructure, this is probably the most capable option that you can run locally right now.