George Hotz
Podcast Appearances
Those ones that are like huge on the ceiling and they're completely silent.
It is the... I do not want it to be large according to UPS. I want it to be shippable as a normal package, but that's my constraint there.
No, it has to be... Well, you're... Look, I want to give you a great out-of-the-box experience. I want you to lift this thing out. I want it to be like the Mac, you know? TinyBox.
Yeah. We did a poll on whether people want Ubuntu or Arch. We're going to stick with Ubuntu.
There's a really simple way to get these models into TinyGrad: you can just export them as ONNX, and then TinyGrad can run ONNX. So the ports that I did of LLaMA, Stable Diffusion, and now Whisper are more academic, to teach me about the models, but they are cleaner than the PyTorch versions. You can read the code. I think the code is easier to read. It's fewer lines.
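A minimal sketch of the path described above, assuming a PyTorch-defined model: export it to ONNX with the standard torch.onnx.export call, then hand the graph to TinyGrad's ONNX frontend. The TinyGrad side is an assumption; its ONNX runner has historically lived under the repo's extra/ directory (extra/onnx.py), so the exact import path may differ between versions.

```python
# Export a PyTorch model to ONNX, then (hypothetically) run it with tinygrad.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
dummy = torch.randn(1, 784)

# Standard PyTorch ONNX export; "input" names the graph's input tensor.
torch.onnx.export(model, dummy, "model.onnx", input_names=["input"])

# Hypothetical tinygrad side (import path is an assumption, taken from the
# repo's extra/onnx.py; it is not part of the pip package's stable API):
# import onnx
# from extra.onnx import get_run_onnx
# from tinygrad import Tensor
# run_onnx = get_run_onnx(onnx.load("model.onnx"))
# out = run_onnx({"input": Tensor.randn(1, 784)})
```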
There's just a few things about the way TinyGrad writes things. Here's a complaint I have about PyTorch: nn.ReLU is a class, right? So when you create an nn module, you'll put your nn.ReLUs in an __init__. And this makes no sense. ReLU is completely stateless. Why should that be a class?
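A short illustration of the complaint, using real PyTorch APIs: the common pattern stores a stateless nn.ReLU instance in __init__, while the functional form just calls F.relu. The final comment about TinyGrad's tensor-method style (x.relu()) is an assumption about its API surface, not taken from the quote itself.

```python
# PyTorch: the module-instance pattern vs. the stateless functional call.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WithModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 8)
        self.relu = nn.ReLU()      # stateless op, yet stored as an object in __init__

    def forward(self, x):
        return self.relu(self.fc(x))

class WithFunction(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 8)  # only stateful layers need to live here

    def forward(self, x):
        return F.relu(self.fc(x))  # the stateless op is just a function call

# In tinygrad the activation is a method on the tensor itself, e.g. x.relu();
# the exact import (from tinygrad import Tensor) may vary by version.
```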
Oh, no, it doesn't have a cost on performance. But yeah, no, I think that it's... That's what I mean about TinyGrad's front end being cleaner.