Corey
And I think I've always wondered kind of what the reasoning is behind that.
Is it a matter of choice for users?
All right, so if you're building anything in AI right now, whether it's models, tools, workflows, you're going to want to hear about this.
Dell just dropped something that's honestly in a league of its own.
The Dell Pro Max with GB10.
That GB stands for Grace Blackwell, which is NVIDIA's next-generation architecture.
And here's why you're going to care about that.
This machine looks small, but it's a powerhouse.
You're getting 128 gigs of unified LPDDR5X memory with super low latency, plus the brand-new NVIDIA GB10 module, which lets you run local inference on models all the way up to 200 billion parameters, which is insane.
No cloud queues, no sky-high compute costs, just your own powerhouse personal AI sitting right there on your desk, waiting for you.
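As a rough back-of-the-envelope sketch (my numbers, not Dell's or NVIDIA's): a 200-billion-parameter model only fits in 128 GB of unified memory if the weights are quantized, for example to around 4 bits per parameter.

```python
# Illustrative memory math for local inference -- assumed figures, not official specs.

PARAMS = 200e9           # 200 billion parameters
UNIFIED_MEMORY_GB = 128  # GB10 unified LPDDR5X memory

def weights_size_gb(params: float, bits_per_param: int) -> float:
    """Approximate size of the model weights in gigabytes."""
    return params * bits_per_param / 8 / 1e9

for bits in (16, 8, 4):
    size = weights_size_gb(PARAMS, bits)
    fits = "fits" if size < UNIFIED_MEMORY_GB else "does not fit"
    print(f"{bits}-bit weights: ~{size:.0f} GB -> {fits} in {UNIFIED_MEMORY_GB} GB")
```

At 16-bit that's roughly 400 GB and at 8-bit roughly 200 GB, neither of which fits; at 4-bit it's about 100 GB, leaving headroom for the KV cache and activations, which is presumably what the 200-billion-parameter figure assumes.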
And here's the wild part.
If you need to go bigger, you can connect two Dell Pro Max units using the ConnectX-7 SmartNIC and 200-gig networking to scale your workloads even further.
It's seamless, fast, and designed for serious AI development.
Everything stays local and secure, whether you're building at the edge, handling sensitive data, or just trying to push the limits of what your models can do.
Plus, it comes ready with the full NVIDIA AI software stack and DGX OS, so you can get right to work.
If you want cutting-edge AI performance in a compact form factor, check out the Dell Pro Max with NVIDIA GB10.
It's the future, and it's already here.
Check out the link in the description to go get one today.
You know, we see a lot of that in AI right now with just the conversations we have with The Neuron.