
The Daily AI Show

OpenAI Garlic Rumors, AI Civil Rights & Nvidia’s New Robotics Model

03 Dec 2025

Transcription

Chapter 1: What is the main topic discussed in this episode?

0.757 - 1.658 Juni

Aloha, everyone.


Chapter 2: What is Nvidia's new Alpamayo R1 model for robotics?

1.839 - 27.555 Juni

It's Wednesday, December 3rd, 2025, and this is episode 608. I'm Juni, and I'm with Beth and Andy. Today, we'll cover the AI headlines from the FDA, New York, AWS re:Invent, and a bunch of other stories, plus a deeper look at how AI is starting to run real-world systems like fusion reactors, power grids, material labs, and weather models.


Chapter 3: How does the AI-powered artificial nose work for odor detection?

27.755 - 35.313 Juni

This is The Daily AI Show. Good morning to both of you and everyone in chat.


Chapter 4: What role does agentic AI play in the FDA's internal workflows?

35.995 - 41.631 Juni

So let's do a quick round robin of our favorite stories for today. Andy, what do you got for us?


43.096 - 68.759 Andy

Well, let's see. You know, there's this need for AI that's applied to embodied AI or robotics. And NVIDIA is a leader among these things. And I wanted to reintroduce a term that's like LLM or large language model, but it's the type of model that is used for robotics.


Chapter 5: How is ByteDance dominating the consumer AI market in China?

68.899 - 102.027 Andy

And it's called a vision language action model, VLA. And NVIDIA just introduced a new such VLA as an open source model called Alpamayo R1. It's designed for things like autonomous driving and robotics, and it integrates visual and textual reasoning to enhance decision making in real world environments. So that's really a new kind of model that combines training for text and


102.007 - 125.871 Andy

the L part with vision, you know, in real time view of what's happening around the model and also taking actions in that context. So a whole new kind of thing that has to be built in order to achieve robots like autonomous vehicles and autonomous humanoid robots as well.
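The perceive-reason-act loop Andy describes can be sketched abstractly. To be clear, this is an illustrative toy interface, not NVIDIA's actual Alpamayo R1 API; all names and values here are invented:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    image: List[List[float]]   # camera frame (toy 2-D grayscale grid)
    instruction: str           # natural-language goal, e.g. "turn left at the light"

@dataclass
class Action:
    steering: float            # -1.0 (full left) .. 1.0 (full right)
    throttle: float            # 0.0 .. 1.0

def vla_policy(obs: Observation) -> Action:
    """Toy stand-in for a vision-language-action model.

    A real VLA would run a learned network over both the image and the
    instruction; this placeholder just keys off the instruction text.
    """
    if "left" in obs.instruction.lower():
        return Action(steering=-0.5, throttle=0.3)
    if "right" in obs.instruction.lower():
        return Action(steering=0.5, throttle=0.3)
    return Action(steering=0.0, throttle=0.5)

# One step of the perceive -> reason -> act loop:
frame = [[0.0] * 4 for _ in range(4)]
action = vla_policy(Observation(image=frame, instruction="turn left at the light"))
```

The point of the sketch is the signature: unlike an LLM, which maps text to text, a VLA maps (image, instruction) pairs to physical actions.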


127.285 - 133.231 Juni

That's pretty cool. And this is NVIDIA's, like, first foray into this kind of model, right?


133.271 - 157.174 Andy

They've had, like, Nemotron and other... And Nemotron is their line of models that are similar, but I think this is an open-source vision language action model. I believe that they have been involved in VLAs, you know, in their sort of simulated and digital environments for a long time.


Chapter 6: What are the implications of the AI Civil Rights Act reintroduced in Congress?

157.829 - 161.554 Juni

Gotcha. Gotcha. All right. So go ahead.


Chapter 7: What does New York's algorithmic pricing law entail?

161.574 - 184.107 Beth

That's interesting because when Meta does open source, right, they do open source because their model comes from, their money model, their revenue model comes from Facebook and the other things that they own for ads and that kind of stuff. But NVIDIA's open source sounds actually kind of really interesting because


184.087 - 197.309 Beth

it's also not associated with their revenue, but it is much closer to their vision in terms of revenue rather than just getting all the eyeballs and selling people the things, you know? Yeah.


197.61 - 221.629 Andy

So you can imagine they want, I think their motive is that they want robotics developers to use their VLA, and then that puts them in a good position to sell them the Thor, which is their version of the small mobile compute package that can operate an autonomous robot.


222.33 - 223.792 Juni

Yeah. Gotcha.


Chapter 8: How is AI being integrated into fusion reactors and scientific control systems?

223.993 - 229.18 Juni

Yeah, an entry point. I can see that. Beth, what story do you have there?


229.514 - 263.841 Beth

I have a story that was highlighted in Science earlier this month. It is a new artificial nose that can detect certain smells for people who don't have a really strong sense of smell. And my older sister is one of those people. She cannot detect the scent of, like, a leaking gas stove and those kinds of things. And that has been a real thing that has made her concerned in her life. Right.


263.821 - 290.2 Beth

So it's a little device that tells you when an odor is present, kind of the same way that a cochlear implant converts sound, but for smell. Now, the Science article didn't mention the AI in the article, but we do know that the Daegu Kyungbuk Institute of Science and Technology did also build a next-gen AI-powered e-nose device


290.18 - 309.218 Beth

And basically what it does is the sensor arrays light up and then that creates a pattern and machine learning learns to read the pattern, which then interprets, yes, this is a smell that is a problem and you should be aware of it. Or no, this is a normal kind of thing, not a problem.


309.198 - 323.545 Beth

So I am making the leap that the science article is talking about a nose that also uses the same kind of thing with a sensor array and then pattern recognition. But it's very exciting.
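The pipeline Beth describes (sensor array fires, that produces a pattern, machine learning labels the pattern) can be sketched with a toy nearest-centroid classifier. The four-channel readings and odor labels below are made up for illustration; they are not from the Science article or the Daegu Kyungbuk device:

```python
import math

# Toy 4-channel sensor-array readings for two odor classes (invented values).
TRAINING = {
    "gas_leak": [[0.9, 0.8, 0.1, 0.2], [0.8, 0.9, 0.2, 0.1]],
    "normal":   [[0.1, 0.2, 0.1, 0.1], [0.2, 0.1, 0.2, 0.2]],
}

def centroid(patterns):
    """Average each sensor channel across a class's training patterns."""
    n = len(patterns)
    return [sum(p[i] for p in patterns) / n for i in range(len(patterns[0]))]

CENTROIDS = {label: centroid(ps) for label, ps in TRAINING.items()}

def classify(reading):
    """Label a new sensor pattern by its nearest learned class centroid."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(reading, CENTROIDS[label]))

alert = classify([0.85, 0.9, 0.15, 0.1])   # strong response on channels 0-1
```

A real e-nose would have many more channels and a trained neural network in place of the centroid lookup, but the shape of the problem is the same: a vector of sensor activations in, an odor label out.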

324.335 - 343.238 Juni

I agree. I agree. Cybernose, right? Cybernose. They've been developing various prosthetics and various sensors. So now we've got smell, just a few more senses to work on, and then we can have a complete package. There you go.

343.285 - 371.272 Beth

And I don't know, like I might prefer, right? I might prefer not to have the sensory scent thing stuck in the middle of my face. Am I having to actually smell it in order to make an assessment, right? Like this is like when we did the cockroach show on the Sci-Fi AI show, right? Like send the cockroach in, get it to do the sensing, and then we'll see how many humans we're going to put on the job.

372.033 - 403.009 Juni

Right, right, right, right, right. Okay, well, my contribution or at least one of them for today is I found this story about the FDA. And so FDA, it's one of the larger government agencies. And so the Food and Drug Administration or the US Food and Drug Administration has announced a broad deployment of what it calls agentic AI tools for all agency employees.

403.192 - 428.479 Juni

It defines agentic AI systems as those that plan and execute multi-step actions to achieve specific goals with built-in guidelines and human oversight. These internal tools will help staff with workflows like meeting management, pre-market reviews, validation, post-market surveillance, inspections, and routine administrative tasks.
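The FDA's definition (plan and execute multi-step actions toward a goal, with built-in guidelines and human oversight) maps onto a simple agent loop. This skeleton is purely illustrative and is not the FDA's actual tooling; the planner and executor are stubs where a real system would call an LLM and external tools:

```python
def plan(goal):
    """Break a goal into ordered steps (a real agent would use an LLM here)."""
    return [f"step {i + 1} toward: {goal}" for i in range(3)]

def execute(step):
    """Carry out one step; a real system would call tools or APIs."""
    return f"done: {step}"

def run_agent(goal, approve=lambda step: True):
    """Plan, then execute each step only if the human overseer approves it."""
    results = []
    for step in plan(goal):
        if approve(step):           # the human-oversight gate in the FDA's definition
            results.append(execute(step))
        else:
            results.append(f"skipped (not approved): {step}")
    return results

log = run_agent("summarize pre-market review backlog")
```

The `approve` callback is the part that distinguishes this from a fully autonomous loop: every planned action passes a checkpoint before execution.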
