
Marco Arment

👤 Speaker
6218 total appearances


Podcast Appearances

Accidental Tech Podcast
624: Do Less Math in Computers

Yeah. Bringing it back to DeepSeek, though. This is a song about DeepSeek. This disruption, with DeepSeek coming on the scene and showing this huge reduction in cost, is just what computer development looks like. We find when we have new...

Accidental Tech Podcast
624: Do Less Math in Computers

software areas, we do things a certain way, and then people find optimizations. One of the most delightful things about software development is that when you find an optimization, oftentimes it's like, oh, this is now a hundred times faster. It can be dramatically better or dramatically faster or dramatically smaller.

Accidental Tech Podcast
624: Do Less Math in Computers

Finding new types of compression or faster algorithms that can reduce the order of magnitude of a function. Stuff like that. That's just how computers go. So what's interesting about this DeepSeek thing is that this is an area where AI model training and model inference are just so unbelievably inefficient in terms of resources used.
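
(As a rough illustration of the "reduce the order of magnitude of a function" point above: a small Python sketch, not from the episode, contrasting an O(n) linear scan with an O(log n) binary search over the same sorted data. The data sizes and timing output are hypothetical and will vary by machine.)

```python
import bisect
import random
import timeit

# Toy data: one million sorted integers and a thousand membership lookups.
data = sorted(random.sample(range(10_000_000), 1_000_000))
targets = random.sample(data, 1_000)

def linear_contains(xs, x):
    """O(n): scan every element until x is found."""
    for item in xs:
        if item == x:
            return True
    return False

def binary_contains(xs, x):
    """O(log n): halve the sorted search space each step."""
    i = bisect.bisect_left(xs, x)
    return i < len(xs) and xs[i] == x

slow = timeit.timeit(lambda: [linear_contains(data, t) for t in targets], number=1)
fast = timeit.timeit(lambda: [binary_contains(data, t) for t in targets], number=1)
print(f"linear: {slow:.3f}s  binary: {fast:.5f}s  speedup: ~{slow / fast:.0f}x")
```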

Accidental Tech Podcast
624: Do Less Math in Computers

The amount of computing power and just hardware and electrical power and the amount of grunt of resource usage needed to make an LLM do anything, or to train an LLM in the first place, is so unbelievably massive that when we find optimizations like this, it shakes the entire market.

Accidental Tech Podcast
624: Do Less Math in Computers

I don't think we've had anything like that in computing for a very long time, where just the normal process of software maturation and software advancement...

Accidental Tech Podcast
624: Do Less Math in Computers

It's like when the BlackBerry CEO thought the iPhone demos were faked.

Accidental Tech Podcast
624: Do Less Math in Computers

Yeah, but look, I mean, do you think the U.S. is that far from that? Like, do you think we're that far from, like, "it has always been called Mount McKinley"? Like, we're not that far from that.

Accidental Tech Podcast
624: Do Less Math in Computers

All right. Thank you to our sponsor this week, The Members. You can become one of The Members by going to atp.fm slash join. One of the biggest benefits of membership is ATP Overtime, our weekly bonus topic. We also have, as mentioned earlier, occasional member specials that are pretty fun and other little perks here and there, the bootleg feed, lots of fun stuff.

Accidental Tech Podcast
624: Do Less Math in Computers

So we're sponsored by our members this week. You can be one too: atp.fm slash join. On this week's Overtime bonus topic, we'll be talking about the Sonos leadership and kind of upper-level shakeup that's been happening, what we think is going on there, and what we think they should do. That'll be in Overtime shortly for members, and you can hear it too: atp.fm slash join.

Accidental Tech Podcast
624: Do Less Math in Computers

Thank you everybody for listening and we'll talk to you next week.
