The Daily AI Show

From DeepSeek to Desktop Agents

15 Jan 2026

Transcription

Chapter 1: What is Claude Co-Work and how does it signal AI productization?

0.588 - 32.412 Andy Halliday

Good morning, everyone, and welcome to The Daily AI Show live. This is the very first time, I don't think it's ever happened before, when just one person is available at the start of the show. Hopefully others will join me, but I am not seeing any people joining this particular stream either, so it could be a very lonely moment here on show 638, where we haven't missed a one.

33.855 - 44.702 Andy Halliday

But it's me, Andy Halliday, and hopefully some of my colleagues will join in a few moments. I want to kick us off with some discussion of a few news items that I think are important.

45.323 - 78.36 Andy Halliday

And one of them is, first of all, there's been a lot of discussion over the past several days about Claude Co-Work, which is a new product leveraging the Claude Code infrastructure, and a really important move that enables general, non-technical users to invoke agency on their desktop in ways that are a lot more user-friendly than the terminal interface that Claude Code provides.

Chapter 2: How is Anthropic expanding its AI product offerings?

79.181 - 113.583 Andy Halliday

I want to cast that as part of a general direction that's happening with OpenAI, Google, and Anthropic, and that is the productization of AI. We're seeing this development of easy-to-access products, or integrations of AI into existing products. And as we discussed before, at OpenAI they actually have a co-CEO kind of arrangement.

113.603 - 119.498 Andy Halliday

You've got Sam Altman, and then you have a president of product.

Chapter 3: What does Microsoft’s AI Economy Institute report reveal about global AI adoption?

120.18 - 149.617 Andy Halliday

And that person, Fiji Simo, is responsible for this productization of AI. There's a flurry of different products, soon to include the range of hardware devices that, you know, OpenAI is planning to deliver, including the newly leaked earbud kind of thing that will challenge AirPods. Hello to everyone in the chat, I see you're all joining us now. It's just me today so far.

150.278 - 151.5 Andy Halliday

We'll see who else joins us.

Chapter 4: Why does the US rank lower than expected in AI adoption compared to other countries?

152.161 - 185.237 Andy Halliday

But back to the productization effort. Anthropic, like OpenAI, has just expanded their team to include a product organization. Even a month or two ago, I would have said, look, Anthropic's focusing on enterprise coding, and that's where they're making their difference. They're very efficient there, and they have a pretty substantial revenue stream that's coming from that.

185.217 - 211.204 Andy Halliday

That makes them, you know, profitable much earlier than OpenAI could become, with OpenAI investing heavily in their own compute and so on. OpenAI's product organization and its multiple product ranges are now being matched by Anthropic, which has expanded its team to incubate experimental products based on its frontier AI capabilities.

Chapter 5: What are the efficiency gains from DeepSeek in underserved markets?

212.325 - 238.326 Andy Halliday

And so again, the co-founder of Instagram, Mike Krieger, joined Anthropic to be in charge of this collaboration around product development using the Anthropic tool set. And then there's another individual who will lead the product organization alongside the CTO. The CTO at Anthropic is named Rahul Patil.

238.306 - 262.828 Andy Halliday

And the new lead of the product organization, and this is not just the core product, which is Anthropic's models, but now new products that leverage Anthropic's tools and deliver to the consuming public and to the enterprise public as well, is a person named Ami Vora.

Chapter 6: How does DeepSeek's conditional memory improve AI reasoning performance?

263.213 - 289.018 Andy Halliday

And so now we have those folks working on new products. And this is all in the context of the recent release of Co-Work, which, using Claude Code, the Anthropic team delivered in under two weeks. So a pretty impressive accomplishment and really changing the game on productization of AI, just as it's possible for you, using vibe coding tools, to create a new product,

288.998 - 318.432 Andy Halliday

a new software product or an interactive experience on the web. All right, so let's move on to another area of discussion that I want to pursue, which is, you know, what is the state of AI adoption in the general working population? Microsoft's AI Economy Institute just a few days ago released a new report showing that globally,

318.412 - 329.952 Andy Halliday

AI adoption in the working-age population is at 16.3% as of late 2025. So only 16% have adopted.

Chapter 7: What impact do Meta's layoffs have on its AI infrastructure strategy?

330.012 - 356.051 Andy Halliday

And you can imagine why. You know, globally that seems like a low number, but not so much when you remember there are a lot of people who just aren't literate enough, or don't have access to the AI infrastructure like a computer and internet access, to really start the adoption process. And those percentages are very, very widely distributed.

356.351 - 388.497 Andy Halliday

The United Arab Emirates has the highest level of adoption among the working-age population, at 64%. And wouldn't we expect that the United States, with all of its AI systems and Silicon Valley development, all the major players, including the chip designer NVIDIA, being American companies, would be next in line behind the United Arab Emirates with their 64%?

389.899 - 410.4 Andy Halliday

Want to guess where we rank? 24th in the world. So we're not really keeping up on the public side, the working-age population, compared to other countries that are really promoting and advancing the use of AI.

Chapter 8: Why is systems thinking crucial for effective AI integration in organizations?

411.072 - 433.553 Andy Halliday

All right. The other thing that came out of that Microsoft AI Economy Institute report, which is interesting, is that DeepSeek has major traction in the smaller and underserved markets out there. And that's a cost issue, obviously. Many of the adopters of AI here in the United States and in Europe and elsewhere are paying

433.533 - 456.635 Andy Halliday

the major companies, either using the free version or paying roughly $20 a month to get access to those models, whereas DeepSeek is less expensive as an open-source model. And it is used more, two to four times more, across the African continent than the other ones.

457.116 - 482.575 Andy Halliday

And a Chinese company, Huawei, which has a lot of the telephone infrastructure, the mobile phone infrastructure, in those developing countries, has also partnered with DeepSeek to advance the use of DeepSeek in those countries. Now, I'm weaving over to something about DeepSeek on the technical side, so take notes. There'll be a quiz on this afterward.

483.955 - 499.602 Andy Halliday

DeepSeek has just introduced a new technique in LLM inference that's advancing its capability in pure reasoning in a dramatic way.

499.582 - 530.986 Andy Halliday

So, you know, the Chinese companies, starved of the sort of scaling compute capability that's available if you can acquire the top-end data center infrastructure like the NVIDIA Blackwell chips and so on, have innovated around efficiencies along two different dimensions. I'll circle back to this, but one of those two dimensions is the use of sparsity.

531.767 - 552.787 Andy Halliday

Now, sparsity is the opposite of dense in the terminology of AI. Dense means that you're using every parameter of the network in each inference run. That's a dense, deep neural network. And sparsity means you're only activating certain portions of it.

553.328 - 585.788 Andy Halliday

So if you have a 100-billion-parameter model, any one inference run is dynamically assessing which portions of that deep neural network, the LLM, have to be activated. And this has given rise to the primary architecture for LLMs today, which is called mixture of experts. So the only experts that are activated are the ones which are relevant to the query.
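To make that mixture-of-experts routing concrete, here is a minimal sketch in Python/PyTorch of top-k expert selection. The layer sizes, the number of experts, the top_k value, and the MoELayer name are all illustrative assumptions for this sketch, not DeepSeek's actual architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Sparse mixture-of-experts layer: each token is routed to only top_k experts."""
    def __init__(self, d_model=512, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # One small feed-forward "expert" per slot; only top_k of them run per token.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])
        self.gate = nn.Linear(d_model, n_experts)  # router scoring every expert per token

    def forward(self, x):                          # x: (num_tokens, d_model)
        scores = self.gate(x)                      # (num_tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)       # renormalize over the chosen experts only
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e           # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

tokens = torch.randn(16, 512)                      # a toy batch of 16 token embeddings
print(MoELayer()(tokens).shape)                    # torch.Size([16, 512])

The point of the routing loop is that each token only ever passes through its two chosen experts, so most of the model's parameters stay idle on any given inference run, which is where the compute and energy savings come from.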

586.408 - 617.473 Andy Halliday

And that reduces the computational overhead, makes for a more efficient and effective inference run, reduces the cost in both energy and compute time, and allows for a larger context window to be executed. So all of those things are improving on the efficiency scale. The second dimension has to do with memory, and this is where the new DeepSeek technique comes in.

617.858 - 644.524 Andy Halliday

We know that models just left as a dense model, injected with your prompt and whatever additional context you type in at the time of inference, can be subject to hallucinations. And so we like to ground that with a retrieval-augmented generation model, where you have an external memory, a database, that is going to be referenced as context.
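As a rough illustration of the retrieval-augmented generation pattern being described, here is a minimal sketch in Python. The embed() function, the in-memory document list, and the prompt assembly are placeholders standing in for a real embedding model, vector database, and LLM call; none of these names come from the episode.

import numpy as np

DOCS = [
    "Mixture-of-experts models activate only a few experts per token.",
    "DeepSeek is an open-source model with strong traction in underserved markets.",
    "Microsoft's report put working-age AI adoption at 16.3% in late 2025.",
]

def embed(text):
    """Toy embedding: hashed bag-of-words, unit-normalized. Swap in a real embedding model."""
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

# The "external memory": documents embedded once, like rows in a vector database.
DOC_VECS = np.stack([embed(d) for d in DOCS])

def retrieve(query, k=2):
    """Return the k documents most similar to the query."""
    sims = DOC_VECS @ embed(query)        # cosine similarity, since vectors are unit-norm
    return [DOCS[i] for i in np.argsort(-sims)[:k]]

def grounded_prompt(question):
    """Assemble the prompt a real system would send to the LLM."""
    context = "\n".join(retrieve(question))
    return f"Context:\n{context}\n\nQuestion: {question}"

print(grounded_prompt("How many experts run per token in a mixture-of-experts model?"))

The grounding step is the retrieve() call: the answer is conditioned on text pulled from the external memory rather than on the model's parameters alone, which is what reduces hallucination.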
