Drew
A study was released showing that AI lies to you.
Even when it knows you're wrong, it will tell you that you're right.
Yeah.
So it's one of those things: do I want partisan prompts fed into an AI that will then double down on the partisanship and end up widening the divide, versus it being the truth algorithm and saying, hey, you can actually balance the budget by doing this, or hey, you can actually do this and still help people, or something like that.
And as a real-world application of this, Amazon held an emergency team meeting to talk about AI breaking its system.
This is a tweet from Lukasz Olejnik.
All in ink.
Yeah, European name.
The official framing is part of normal business.
The briefing note described a trend of incidents with high blast radius caused by generative AI-assisted changes, for which best practices and safeguards are not yet fully established.
Translation to human language.
We gave AI to engineers and things keep breaking.
The response: from now on, junior and mid-level engineers can no longer push AI-assisted code without a senior signing off.
AWS spent 13 hours recovering after its own AI coding tool, asked to make some changes, decided instead to delete and recreate the environment.
The software equivalent of fixing a leaky tap by knocking down the wall.
Amazon called that an extremely limited event.
The affected tool served customers in mainland China.
And then inside the email itself, Amazon's e-commerce business has summoned a large group of engineers to a meeting on Tuesday for a deep dive into the spate of outages, including incidents tied to the use of AI coding tools.
The online retail giant says there has been a trend of incidents in recent months characterized by a high blast radius.
And again, generative AI-assisted changes.