The Daily AI Show

From Pokémon Go to Open Jarvis

16 Mar 2026

Transcription

Chapter 1: What is the main topic discussed in this episode?

0.031 - 26.781

Hey, how's it going, everybody? Today is March 16th, 2026, and this is the Daily AI Show Live. It's Monday, and we're back for another full week of shows. Really excited to have everybody join along with us, everybody in the comments as well as our co-hosts. It should be a good week, and I'm looking forward to it, as always. With me today are Andy and Beth, and I'm Brian.

27.402 - 45.217

and I think it's probably just us. We'll see if Carl pops in the door or not. But otherwise, there's always something interesting going on over the weekend. When we come back on Mondays, you kind of get two days' worth of AI news. AI doesn't really take the weekends off; there's always something going on around the world.

45.878 - 66.562

I will tell you one of the stories that was interesting. We all remember Pokémon Go. Pokémon Go is still around, obviously, and I think it's probably doing very well. It's not really part of my life, you know, but I definitely remember the big phase, the big push of it.

66.862 - 85.309

But I've also seen that there are still, like, tournaments and all this other kind of stuff. Anyway, all that to say, the quote from the piece, and I've seen this in other places too, is: Pokémon Go turns players into free AI labor. You think, well, how did they do that?

85.69 - 106.18

Well, as people were walking around with their phones playing a free game, they were also geolocating all sorts of things in the world, wherever these free Pokémon were and they were capturing them, or however the game goes. And those were being tagged in different lighting conditions and different weather conditions and otherwise. And I like what they say here.

106.22 - 125.949

The key takeaway, the punchline writes itself: millions thought they were hunting cartoon creatures while quietly performing unpaid field work for spatial AI. The old internet rule still holds: when the product is free, the real product is usually you. So I don't know if this would really upset people or whatever, but it made me chuckle for some reason.

126.31 - 143.151

Just the idea of this really fun, addictive game. I think it was during COVID, was it? It was bringing everybody outside, and there were some positive stories about this, you know, people just being outside enjoying fresh air, but also playing a game with their kids and stuff like that.

Chapter 2: How does Pokémon Go contribute to spatial AI data collection?

144.113 - 165.019

But obviously all this data is being captured one way or the other. So, yeah. I have one of the stories up, let me bring it up, because I just thought, well, there you go. And as they said, you know, if it's free, it's you; you're paying for this. I feel like there were discussions about that at the time.

165.039 - 186.1

And for me, the heyday was pre-COVID, and I just never got into it. I couldn't figure out why I would be excited about the things. I might be more excited about it now because of, you know, augmented reality and some stuff we should talk about on the show. I mean, I was kind of shocked by this, Beth.

186.12 - 201.808

I mean, it just goes to show you never know about whole pockets of communities of people. There's a guy that I follow; his YouTube channel is The Tim Tracker. Anyway, it's just a family. He started it years ago, they've since had two kids, whatever. It's mostly about...

201.923 - 227.577

theme parks in Orlando, which is how I got involved with watching the channel. But he has grown to a point now where this is his full-time job, along with his wife. And so he got invited out, just maybe a couple of videos ago, out to California around Anaheim for some Pokémon Go event. I think it was at the Rose Bowl, I could be wrong about that. But he was sort of showing what was going on, and I think he was paid to go out there and obviously make content about it, whatever, but

227.557 - 245.733

there were people hanging out underneath the shade of a tree. I mean, a lot of people waiting for something to drop and do the whole thing. And I'm certainly not poking fun at it. It was a big community event. Lots of people seemed to be having a really good time and also meeting new people, and it was all outside.

245.813 - 259.769

And so I'm like, hey, you know, this is better than being in some arena with a roof on. At least people are enjoying the environment, you know? So anyway, I'm sure there are people out there who are going to listen to this and be like, Brian, you're murdering it this time, you don't understand it at all. And that might be the case.

259.849 - 279.291

It really was more about the AI and the idea that people holding up their phones are doing this free, you know, field work, as they called it, which I just thought was like the best name. You're doing AI field work, but you don't know it, because you're invested in the game or whatever. And yeah. I mean, you're doing it if you drive a Tesla, too.

279.872 - 301.573

And you paid to do that. It's true. You also get a car as opposed to a new augmented reality image that you captured. It makes you wonder, though, why Google has not done something like this. Maybe they have gamified it, but why haven't they gamified Google Maps?

301.672 - 320.629

Because, or Apple for that matter, actually. Because, you know, Apple was always sort of like, historically, remember when they put out Maps? I don't know, it was several years ago now, but there was a point where the iPhone wouldn't allow Google Maps on the phone. They wanted you to use Maps, but Maps from Apple was hilariously worse.

Chapter 3: What are the implications of NVIDIA GTC for AI inference?

349.973 - 369.49

I mean, this is not like a major company that is using unpaid labor in that kind of way. Like, it could have had real blowback for somebody like Google or Apple, who were the map providers, but... Well, that's a good question, too.

369.97 - 389.015

Who gets the geotag training data from the makers of Pokémon Go? Which, by the way, I have no idea who that is, who owns the rights to Pokémon and all that. But who gets that data? Maybe it does flow back to the map makers. And, you know, are they selling it, or are they using it internally? They're using it as training data.

390.517 - 399.518

For themselves, you're saying you don't think they're selling it like third party, they're not selling the data? Oh, no. I think if you have data you can sell, you eventually sell it.

Chapter 4: How are chipmakers preparing for local inference and agentic AI?

400.42 - 434.384

But their system, they're training things as well, I believe. Anyway, interesting story, nonetheless. What piqued your guys' interest coming out of the weekend? Well, today is the launch of GTC, NVIDIA GTC. GTC is the GPU Technology Conference; it started in 2009. Jensen Huang and NVIDIA are now at the heart of everything AI.

434.464 - 465.143

And the cognoscenti are referring to GTC, which is about to kick off in a couple of hours, I think, as the Super Bowl of AI. Or, if you want to think about the nostalgia that's present in the long history of GPUs and NVIDIA in the world of gaming preceding that, it's kind of like the Woodstock of AI. So it's going to be...

465.123 - 494.59

interesting to see what comes out of it. I think the big story is going to be how Jensen and NVIDIA present the future of GPUs in the context of inference, where the GPU is not ideal, and the Groq acqui-hire thing that they did, the $20 billion that they gave to Groq to bring the inference-oriented LPU into the NVIDIA fold.

494.57 - 518.485

That suggests they're going to be talking a lot about inference, because the era when GPUs had to be colossally stacked together in order to do training of models is kind of receding, since there are already these training data centers in the hands of the frontier model developers.

518.465 - 549.349

Now the big demand for processing is for actual real-time inference, especially with the advent of everybody having a Claw, right? So now, cloud-based inference and/or local inference using fast processors on local devices, all of that is what the future of chips and processing and AI is mostly about, rather than just training. We've kind of passed through the early phase where...

549.329 - 579.032

putting together major data centers for training AI was the major source of demand and drove the value of NVIDIA up over $4 trillion. So we're going to hear about the ideas that are going to help NVIDIA defend itself against entries like Cerebras, which has a much faster inference chip with their wafer-scale chip technology. And then...

579.012 - 601.358

also robotics and autonomous vehicles; there's a lot of discussion around those things at GTC. Go ahead, Brian. No, no, I just had a question, I mean, to either of you guys. But, you know, obviously the sort of OpenClaw era, let's call it, or the, you know, multi-agents running...

602.603 - 623.295

You know, when Jensen Huang stands on stage and holds something, you know, the size of my wallet or smaller, or even smaller than that, and it's now the most powerful whatever, right, and he's like, blah, blah, blah. Well, obviously that didn't get manufactured in the last six months, probably. So I'm curious what you guys think is, like...

624.372 - 647.224

For them to be continuously successful, as NVIDIA or any of the chip manufacturers have been, they obviously have to be betting on the future. And do you think, just based on watching how things were growing, that they were saying this to themselves over a year ago? And that's probably understating it, I would say.

Chapter 5: What is Stanford's Open Jarvis and its role in personal AI agents?

712.245 - 738.872

These are, you know, dust-proof rooms and the whole thing. I just say that to say, hardware is hard for a reason, and this stuff doesn't just happen overnight. I know they're always continuously evolving at NVIDIA. But how do you guys think they plan for it and adjust toward it? Or is it the same trajectory regardless with AI, and they're like, this stuff is going to happen in this period...

738.852 - 763.189

...and this pacing? And yes, to us, we see the OpenClaws, the My Claws, and we go, oh, that's new and shiny or whatever. But from NVIDIA's standpoint, they're like, yeah, we're not really reacting to that kind of stuff; we have a six- or ten-year model and we just build toward that, because that's the eventuality. Well, Beth, you probably have a perspective on this as well, but I think that, you know...

763.878 - 788.907

NVIDIA went through a transition with the GPU, which is their core expertise, their core competence. Originally they were selling GPUs to individuals in the form of a PC add-in: you add this high-powered GPU in order to have a high-end gaming machine. So that's really where they cut their teeth. So they were selling...

789.815 - 804.908

a device that was being ultimately delivered to an individual consumer, a local device. So it's in their nature to anticipate that there's eventually a local computing device that is AI proficient.

805.388 - 841.403

It's a combination of a GPU and maybe new architectures that can make this happen efficiently and with low power, which is not the case for the GPU, with its quadratic expansion of computation to do the math inside. They're looking to shrink that down and get it into a form that's low power and highly adapted to doing AI inference on the local device. Right. Okay.
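As a rough illustration (this sketch is not from the episode): the "quadratic expansion of computation" being described matches how transformer attention compute scales with context length, which is a big part of why local, low-power inference is hard. The toy Python sketch below uses made-up dimensions; `attention_flops` and `d_model` are illustrative names, not any real API.

```python
def attention_flops(seq_len: int, d_model: int = 4096) -> int:
    """Rough FLOP count for one attention layer's score matrix.

    Computing Q @ K^T multiplies a (seq_len x d_model) matrix by a
    (d_model x seq_len) matrix: seq_len^2 entries, each needing
    d_model multiply-adds (counted as 2 FLOPs each).
    """
    return 2 * seq_len * seq_len * d_model

# Doubling the context length quadruples the attention compute:
base = attention_flops(1_000)
doubled = attention_flops(2_000)
print(doubled / base)  # 4.0
```

This quadratic curve is what makes long-context inference on battery-powered local devices a chip-design problem, not just a software one.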

841.423 - 862.187

So I think they're seeing that clearly in their future, and that's what they're moving toward. Let me just add one quick thing, Beth, and then I'm going to turn it over to you, which is that the shrinking-it-down part is also a natural part of the overall objective of chip design and manufacture: let's get more transistors onto this tiny little piece.

863.128 - 889.927

And so, you referred to microns, but it's actually nanometers now. There are these dies for the silicon that originally were at four nanometers, and then three nanometers became the current mass-market scale for the size of the transistors going onto the silicon. And now there are fabs at two nanometers.

890.307 - 917.471

And there are new chips just coming out with two nanometers as their sort of resolution for imprinting onto those chips. So everything's shrinking down, everything's going local. I think that's a natural part of the progression. Beth? Right. And NVIDIA famously listened to the people who were using their technology back when it was majority gamers. That's part of the culture.

Chapter 6: How can Claude Code enhance mobile coding experiences?

917.511 - 930.15

It's like they're anticipating what's going to happen, but not in a vacuum. The people who are actually using their systems are...

930.13 - 955.902

sending the information back, like, this is the next thing that we need. And they already have a history of knowing what signal to pay attention to. I guess so. I mean, it's really fascinating to me, and I'm sure there are other ones; people are mentioning AMD in the comments, and yes, of course there are other companies, it's not just NVIDIA. It just happened to be that, Andy, you were talking about the conference, you know. But it's fascinating to me because...

957.13 - 979.016

I don't know how quickly they can invent something new or, you know, like have a new chip or whatever. It's a continuous process. I understand that. But, you know, AI is pivoting. And some of the things we've seen even just in the last six months now, you know, end of 2025 and all of 2026 have really shifted the landscape a bit.

979.917 - 1005.992

But if you're those companies, you're not just a software company; you're always hedging your bets on future tech that really hasn't come out yet. And they're doing it successfully, and have been doing it successfully for years. To both of y'all's point, it's impressive to me at a really, really large scale. And again, I don't mean to single out NVIDIA, but to be continuously right...

1006.092 - 1017.931

And I'm sure they've had their missteps along the way. But to be right like that is impressive, you know, versus other companies that seem to struggle with that same kind of thing. Oh, yeah.

1017.951 - 1039.331

And when you're a four-trillion-dollar company, you have plenty of resources, not only to spend on people inside who are at the cutting edge of the architectures you're designing, but also to make investments in all the little companies out there that are at the cutting edge, right? So they are a catchment for the latest and greatest.

1039.852 - 1067.894

It's not surprising to me that they have this almost impregnable position. By the way, Greg mentioned in the chat that there's a rumor that NVIDIA is going to come out with an x86 part, which is a CPU architecture. And because of NanoClaw or OpenClaw and all the many, many Claws, now the orchestration of multiple agents is a key part of computing.

1068.635 - 1102.562

And that's not a GPU skill; the GPU is not the best architecture for that, a CPU is. A CPU does that kind of marshalling much better. And so the design for these agentic computing stacks is going to be a combination of a CPU and a GPU, as it is right now in your computer, right? So if you buy an M-series machine, on the same chip they've got CPU cores and GPU cores.

1102.542 - 1120.629

All of it's stacked together, and that's one of the advances, I think, that Apple presented when they designed their M-series chips. Now, NVIDIA isn't building their own CPUs; they're partnering with Intel to do it, and that just came out as an announcement this past week.

Chapter 7: What are the limitations of using Claude Code on mobile devices?

1182.108 - 1209.563

Intel's supposed failure was that they missed the boat on inference and training. So they did not have the right tool set for the AI companies to build on Intel rather than on NVIDIA. So NVIDIA took all of that market share. Well, now CPUs are becoming more important again. And so NVIDIA is stepping in. I want to just bring this up really quick from Gareth.

1209.583 - 1216.25

He said, speaking about NVIDIA, they're probably one of the only large companies that is agile enough to make quick shifts.

Chapter 8: How does Google's new embeddings model improve multimodal search?

1216.61 - 1237.325

And Gareth, you're probably right there. What I would say, just bouncing off of what you said, Andy, is that they're so large that, at any one time, maybe the real answer is that, no, they don't always guess right. They just always have ten things moving at once, and one or two of them are going to be right.

1237.385 - 1257.711

And they're big enough, and maybe agile enough, that the nine others never see the light of day. But they're moving on them in case they do, in fact, pan out. So it could be that the real answer here is that there are many departments at NVIDIA and similar companies where people are working on stuff that simply never makes it

1257.691 - 1274.113

out of the starting blocks, because that's not the way the wind shifted, that's not the way AI went. And with AI, you really can't just bet on one solution. We talk about that at our level; we talk about not going all in and putting all your eggs in one basket with one model.

1274.752 - 1296.237

At their level, they're talking about, obviously, a much, much bigger version of it. Which is kind of a quick pivot here, because I was going to throw it over to you, Beth, for some of the news you saw. But since we're still talking about this, did you guys see the news about Z.ai? They're claiming to be the first to come out with, like, a sort of Claw-first model,

1296.706 - 1326.797

essentially, and it's called GLM5, a new flagship model for chat, coding, and agentic tasks. That came out a few weeks ago, yeah. Oh, it did? Yeah. Yeah, my bad. Okay, never mind. For whatever reason it hit my news cycle today, but we can move on. No, but the relevance of GLM5 to the Claw ecosystem, I hadn't thought of that.

1326.817 - 1352.767

But when GLM5 came out, Z.ai went way up on the open-source leaderboards because of its ability for agentic reasoning. So I think that's a key part of it. Everybody's interested in that agentic reasoning skill set, and I think it maxes out the open-source models' position on the leaderboards for that. And just curious, had you guys heard of this company before?

1353.007 - 1377.979

I mean, I always think of, like, Z.ai: how much did you pay for that one letter dot-AI? So, first of all, who's backing that? Like, is that... This is a Chinese company. This is a Chinese company. Is it? Yeah, so while you were busy doing real work, we were talking about it on the show a couple of weeks ago. This is real.

1379.68 - 1404.439

We've been talking about... Had you guys already... When you talked about it a couple weeks ago, were you like, oh, yeah, Z.AI? Because I don't think I've ever heard of this company. But, I mean, that's nothing new. Things miss me all the time. Before it released GLM5, I had never heard of it. Okay. They pay double or more. What, I just said hundreds of thousands? Yeah, probably. Yeah, I mean...

1405.212 - 1429.843

One letter dot-AI, I don't care what letter it is, never mind Z. Any one-letter, I'll say any three-letter, .ai domain is... Yeah. No, I don't think F.ai is going for a lot of money right now. You don't think F.ai? I bet it is going for a lot. Anyway, all right, well, there's old me.
