Podcast Appearances
But the idea is that we're going to open it up very quickly for everybody, as soon as we're sure that things are going to scale just fine.
But basically, we will provision an instance.
You authenticate it as a node on your network.
It just shows up.
You can start using it right away.
It's in line with what we already do for Tailscale.
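To make that flow concrete, here's a rough sketch of the Tailscale-style pattern being described, using Tailscale's standard pre-auth-key mechanism. The key value and the hostname are placeholders, and the command is echoed rather than executed, since joining a real tailnet needs a real key from the admin console.

```shell
# Placeholder pre-auth key; real keys are generated in the Tailscale admin console.
AUTHKEY="tskey-auth-EXAMPLE"

# On the freshly provisioned instance, this is the shape of the command
# that joins it to your tailnet so it "just shows up" as a node.
# Echoed here instead of run, because it requires a valid key.
CMD="tailscale up --authkey=$AUTHKEY --hostname=example-node"
echo "$CMD"
```

Once the node authenticates, it appears in the tailnet and is reachable right away, which is the "it just shows up" experience described above.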
It's free for homelab use.
We're going to announce how we're bundling it as part of the free plan, free for home use, just like we do today.
Obviously, we're planning on it being a paid product for enterprise, but we're still exploring pricing and all that.
But I want every homelab user, anybody who's playing with LLMs and API keys at home, to just be using it.
It just makes things easier, sort of like the Tailscale way.
Yeah, so we are hosting these instances for customers right now.
There are plans, and talk, of self-hosted versions, and certainly some enterprises would insist on that.
There's varying degrees of what that might mean.
Customers might say, oh, I want to bring my own cloud: you just write the logs there, but you can still host the actual stuff that's taking up the CPU.
Or some customers might say, no, we have to have everything on-prem.
But right now, we wanted to get Aperture into as many hands as possible, as quickly as possible.
And the easiest way to do that, and I think one of the safest ways, frankly, is just to let us host the instances at this point.