Noah Labhart
How did you build that team?
And what'd you look for in those people to indicate that they were the winning horses to join you?
This episode is sponsored by Unblocked.
There's a good chance you've already tried a few AI code review tools, and you're probably ignoring most of their comments.
Not because AI can't review code, but because it's missing context.
Most AI reviewers focus on surface-level issues: style nits, obvious refactors, or restating what's already clear from the diff.
Meanwhile, the things you actually care about, like whether a change violates an earlier architectural decision or quietly duplicates existing logic, go unnoticed.
That's the problem Unblocked is built to solve.
Unblocked's AI code review is grounded in decision-grade context: prior PRs, design discussions, documentation, and system-level constraints, the same context senior engineers rely on when reviewing code.
Teams using Unblocked report fewer comments, higher signal, and automated reviews they actually trust, enough that many have turned off other review tools entirely.
Even if you've already written off AI code review, Unblocked is worth a look.
Get a free three-week trial at getunblocked.com/codestory.
That's getunblocked.com/codestory.
This episode is sponsored by Mezmo.
If you're responsible for reliability, performance, or platform architecture, you already know the problem.
Telemetry volume is growing faster than teams can manage it.
Mezmo addresses this by moving observability upstream.
Instead of storing everything and asking questions later, Mezmo processes telemetry in motion: filtering, transforming, and enriching logs, metrics, and traces before they reach your observability backend.
The result is cleaner data, reduced ingestion costs, and faster root cause analysis using the tools you already rely on.
Mezmo integrates with platforms like Datadog, Dynatrace, and open source stacks, giving teams more control without adding operational overhead.