Beth Lyons
Do you have it on your phone?
I was like, damn.
LFM2.5-1.2B, dash Thinking.
And it's a reasoning model.
It's a thinking model that can run entirely on a phone with less than one gigabyte of memory.
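The "under one gigabyte" claim checks out with back-of-envelope arithmetic, a sketch assuming the weights are quantized to roughly 4 bits each (the quantization level is my assumption, not stated in the episode):

```python
# Rough memory footprint for a 1.2B-parameter model.
# Assumption (not from the episode): weights quantized to ~4 bits each.
params = 1.2e9
bits_per_weight = 4

weight_bytes = params * bits_per_weight / 8   # total bytes for weights
gib = weight_bytes / 2**30                    # convert to GiB

print(f"{gib:.2f} GiB")  # ~0.56 GiB, leaving headroom for the KV cache under 1 GB
```

At 8-bit quantization the same model would need about 1.1 GiB for weights alone, which is why aggressive quantization is what makes phone-sized reasoning models plausible.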
And this is, again, the start, the second, third, fourth start, of moving from AI in the cloud to being able to have it on the edge device.
And what they're touting is that it's on your phone, but it could also be on a Raspberry Pi or something else, even a lower-end Raspberry Pi, not the one with the AI HAT and the 16 gigs.
So this was trained using thinking traces, or it produces its content using thinking traces, which is similar to OpenAI's whole o-series, but compressed for local hardware.
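In practice, a thinking model's raw output interleaves a reasoning trace with the final answer, and the app strips the trace before showing the user. A minimal sketch, assuming the common `<think>...</think>` convention used by several open reasoning models (not confirmed for LFM specifically):

```python
import re

def split_thinking(raw: str) -> tuple[str, str]:
    """Separate a model's reasoning trace from its final answer.

    Assumes the trace is wrapped in <think>...</think> tags,
    a convention of several open reasoning models; this is an
    illustrative helper, not LFM's actual output format.
    """
    m = re.search(r"<think>(.*?)</think>", raw, re.DOTALL)
    if not m:
        # No trace found: treat the whole output as the answer.
        return "", raw.strip()
    trace = m.group(1).strip()
    answer = raw[m.end():].strip()
    return trace, answer

trace, answer = split_thinking("<think>2 + 2 is 4.</think>The answer is 4.")
print(answer)  # The answer is 4.
```

Locally, both the trace and the answer stay on the device, which is the privacy point made below.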
And it's a breakthrough in privacy because your local hardware ideally doesn't share to the cloud what you're doing.
I think that means you don't have Meta on your phone, but okay.
Because Meta is a greedy spy.
And you're not worried about paying a monthly subscription fee for API costs.
Oh, absolutely.
And you have been able to run local models through something like Ollama, which you can install on your phone, but they haven't been reasoning models.
They've been much, much diminished.
There's a reason that reasoning models are reasoning models.
And they're limited in terms of how much you can use them unless you pay for a much higher subscription.
But that's part of what this release is, because it rivals much larger cloud-based models like Qwen 3.