Rene Haas
Podcast Appearances
I'm not going to say that today, but could we do that?
I hinted in the last conference call that we're looking at going a little bit further than we do today.
Yes, and I also think you have a third bucket where training distills down to simpler training chips, so you don't need to run a trillion-parameter model.
You can have a giant model that now trains and teaches smaller models.
Mixture of experts, 20 billion parameters, that can be a mix of inference and training, doing reinforcement learning, where the chip is now helping to learn and train.
It's almost like the professor teaching a student, who can also be a student teacher, who can do a little bit of both.
And then there's inference, which over time will be very dedicated, particularly as you get to endpoints where you can't have a GPU that runs at a kilowatt of power.
It's impossible.
Yeah, physical AI is going to be a gigantic market.
I mean, today, quite candidly, they're using...
Bigger than data centers?
Yeah, I think so.
Because I think today they largely use repurposed automotive chips, things that have functional safety compliance around ADAS, but they're not specific for actuators or for the smaller parts of the joints.
So physical AI, particularly AI that can learn, is I think going to be a giant market, because the robots themselves will have tens of chips, hundreds of chips.
So yeah, from a unit standpoint, it could be huge.
The numbers are going to be well beyond what we see today.
To some extent. Although we don't build anything, our business model is that we do the design and someone else has the chip built, mostly at TSMC, some at Samsung, even Intel.
But we are early in the value chain relative to the software ecosystem; in other words, we probably see what people are doing earlier than anybody else, because ultimately we're the link between the hardware and the software.
So on export control, yes, to some extent we have a very big lens into it.