
John Siracusa

👤 Person
5126 total appearances

Appearances Over Time

Podcast Appearances

Accidental Tech Podcast
624: Do Less Math in Computers

We're the best at it. ChatGPT is the best. Everyone's got a large language model, but we're just a little bit better than all of them. And that's why we need $500 billion or whatever to build new data centers to train the next model, blah, blah, blah, blah.

Accidental Tech Podcast
624: Do Less Math in Computers

And here comes this Chinese company saying, well, we read all the same papers, and we have crappier GPUs, and we spent less money, but our thing is basically as good as yours, OpenAI. So what do you think of that?

Accidental Tech Podcast
624: Do Less Math in Computers

Not only that, but, like, you know, running inference on our thing, which is, like, executing the AI models and using them for everybody else, is way cheaper than your thing. Yeah, everything's cheaper. Everything about it is cheap. It was cheaper to train and it's cheaper to run, to actually use. And that's one of the reasons that one of the stock prices that did not take a hit was Apple, because I guess the theory is that, well, if inference becomes cheaper and Apple likes to do lots of on-device AI, that's good for Apple.

Accidental Tech Podcast
624: Do Less Math in Computers

Now, it's not like Apple is using DeepSeek in their operating system, but just conceptually, if the cost of inference goes down for equal performance, I guess that benefits Apple, because they're doing a lot of inference on device or whatever. But we'll see. I think this whole kerfuffle is, I feel like, just kind of a correction to some inflated stock prices.

Accidental Tech Podcast
624: Do Less Math in Computers

But in general, being able to do the thing better, for less money, and with less power is what we expect with technological progress. What we don't expect is that, like, every year it will take even more power. And, you know, we expect things to get better, but keep in mind that DeepSeek is not, like, massively better than OpenAI.

Accidental Tech Podcast
624: Do Less Math in Computers

It's roughly about the same, with some caveats that we'll get to in a little bit. But the whole point is, yeah, it's the same, but cheaper and better and lower power and blah, blah, blah, right? And I'm like, great, that's what I expect. I expect, like, you know, the MacBook Air that you can get now to have roughly the same performance as, like, an old MacBook Pro, right?

Accidental Tech Podcast
624: Do Less Math in Computers

But lower power and better, like, I expect that to happen. But I guess people were startled that it happened so quickly, especially since OpenAI has always just been making noises of, like, the only way we can surpass it.

Accidental Tech Podcast
624: Do Less Math in Computers

o1 to make the next generation is for you to give us billions more dollars. And yeah, apparently even just to do o1-caliber stuff, you did not need that much money. You just need to be a little bit more clever. And the fun thing about the cleverness, which we'll get to in a little bit, is kind of like the saying that constraints lead to better creative output, but

Accidental Tech Podcast
624: Do Less Math in Computers

because this Chinese company had to work with previous generation hardware, they were forced to figure out how to extract the maximum performance from this older hardware. They had to make compromises. They had to do approximations. They had to come up with new technologies that said, we can't do it the way OpenAI did it. We don't have the money. We don't have the time.

Accidental Tech Podcast
624: Do Less Math in Computers

We don't have the technology. We need to find out a way to
