George Sivulka
Podcast Appearances
So with NVIDIA, I think the best moats aren't technological moats. They're not data moats. They're actually people moats. People and networks have the most friction to change. One of the things NVIDIA did best was making this early bet on machine learning.
They created CUDA, which, as I mentioned before, is how almost everyone learns to train models, how to interface with NVIDIA chips for training.
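As a rough illustration of that ecosystem lock-in, here is a minimal sketch (assuming PyTorch, with a placeholder model and dummy data) of the kind of CUDA-backed training loop most practitioners learn on NVIDIA hardware:

```python
# Minimal sketch (assuming PyTorch) of a CUDA-backed training loop.
# The model, data, and shapes here are placeholders for illustration only.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(128, 10).to(device)          # move parameters onto the GPU
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 128, device=device)        # dummy batch resident on the GPU
y = torch.randint(0, 10, (64,), device=device)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()                             # gradients computed via CUDA kernels
    optimizer.step()
```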
As you're starting to see, and going back to that prediction I made earlier, the shift from training to inference is a fundamental, almost macro shift in how people deploy AI, and I actually think it will slightly destabilize the dominance of NVIDIA chips.
You can start to actually use AMD chips or even custom architectures, which all the major model providers are currently exploring for inference. So you have your academics and your researchers training large models on NVIDIA chips, but the minute they deploy them, they can deploy them on cheaper infrastructure. And that, I think, will be a big change.
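One concrete version of that train-on-NVIDIA, deploy-elsewhere split is exporting a trained model to a portable format and serving it through a runtime that can target non-NVIDIA backends. The ONNX path below is just one illustrative option, not something named in the conversation, and the model is a stand-in:

```python
# Sketch: export a PyTorch-trained model to ONNX, then run inference through
# ONNX Runtime, which can target CPU, AMD (ROCm), or other accelerator
# execution providers at deployment time. Model and shapes are placeholders.
import torch
import torch.nn as nn
import onnxruntime as ort

model = nn.Linear(128, 10).eval()               # stand-in for a trained model
dummy_input = torch.randn(1, 128)

torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["input"], output_names=["logits"])

# Pick whatever execution provider the deployment hardware supports.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
logits = session.run(["logits"], {"input": dummy_input.numpy()})[0]
print(logits.shape)  # (1, 10)
```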
So I'm actually still bullish on NVIDIA, but I'm even more bullish on other chip makers and custom ASICs to do inference because I think there will be a larger shift to inference moving forward.
Those will probably be large tech providers and AMD. I don't know about Intel, right? I would probably bet on them. There's definitely an opportunity in the market, but chips are hard.
I think 90% of the market is still in the experimental-budget phase, but we're starting to see early promise of actual value. And my entire business is focused on just those repeatable use cases.