
Nathaniel Whittemore


Disturbing for those of us who identify as software engineers, but no less true.

That's not to say software engineers don't have work to do, but writing syntax directly is not it.

I think overall, trying to sum up, Andrew Curran does a great job.

After discussing the five-year and two-year timeline predictions for AGI, Curran writes: Dario said that if he had the option to slow things down, he would, because it would give us more time to absorb all the changes.

He said that if Anthropic and DeepMind were the only two groups in the race, he would meet with them right now and agree to slow down.

But there's no cooperation or coordination between all the different groups involved, so no one can agree on anything.

This, in my opinion, is the main reason he wanted to restrict GPU sales.

Chip proliferation makes this kind of agreement impossible.

And if there is no agreement, then he has to blitz.

This seems to be exactly what he has decided to do.

After watching his interviews today, I think Anthropic is going to lean into recursive self-improvement and go all out from here to the finish line.

They have broken their cups and are leaving all restraint behind them.

Ultimately, folks, last year, one got the sense that the conversations about AI at Davos were still highly theoretical.

This year, I believe there is a definite shift, a different confidence in the predictions, based on the evidence we've seen over the last year.

On X, Diego Odd wrote, Outside our bubble, most people have absolutely no idea that we could be just six to 12 months away from powerful AI models capable of accelerating progress in a way that resembles a fast takeoff.

Sure, as Dario remarks, there could be physical roadblocks like chips that slow things down.

But again, it's nearer than most people think, and the majority of the world is living as if nothing is happening.

In perhaps the truest statement I've read this January, he concludes, 2026 will be a weird year.

Brace yourself for the next generation of models.

That's going to do it for today's AI Daily Brief.