Corey
We're building models on our own.
We're going to go and we're going to build our own tools.
We're going to look at other people's tools.
We're going to work with third-party model providers.
And I think I've always wondered what the reasoning is behind that.
Is it a matter of choice for users?
All right, so if you're building anything in AI right now, whether it's models, tools, workflows, you're going to want to hear about this.
Dell just dropped something that's honestly in a league of its own.
The Dell Pro Max with GB10.
That GB stands for Grace Blackwell, which is NVIDIA's next-generation architecture.
And here's why you're going to care about that.
This machine looks small, but it's a powerhouse.
You're getting 128 gigs of unified LPDDR5X memory, super low latency, and the brand new NVIDIA GB10 module, which lets you run local inferencing on models all the way up to 200 billion parameters, which is insane.
No cloud queues, no sky-high compute costs, just your own powerhouse.
Personal AI sitting right there on your desk waiting for you.
And here's the wild part.
If you need to go bigger, you can connect two Dell Pro Max units together using the ConnectX-7 SmartNIC and 200-gig networking to scale your workloads.
It's seamless, fast, and designed for serious AI development.
Everything stays local and secure, whether you're building at the edge, handling sensitive data, or just trying to push the limits of what your models can do.
Plus, it comes ready with the full NVIDIA AI software stack and DGX OS, so you can get right to work.