Tobias "Tobi" Konitzer
It was meant for a sample size of 100 people.
We tried to apply it to a sample size of millions.
And that was a hard constraint.
And what we did back then is we somehow managed to get this thing onto the graphical processing unit when those came up on AWS, and that made it scale.
That was probably the wrong way to do that because it burned a lot of resources.
I was way too stubborn to let go of this beautiful Bayesian, purest Bayesian implementation.
And even then, there was a way to make things scale.
So I think it's very rare that you face hard constraints where technology really doesn't scale.
There's obviously exceptions to the rule.
If you're trying to do something in session, or your latency budget is something like 30 milliseconds and you're building complex APIs, then scale is important.
But usually it can be solved one way or another.
What cannot be solved so easily is scale when it comes to people.
And my conviction is this.
I don't want to say what I said elsewhere.
Customers are smart.
And sometimes you got to force customers a little bit to their own luck.
So when you build something like experimentation, I will always bias towards defaulting new releases to on while still letting customers turn them off.
If you have a validated vision that has a lot of ingredients, I take a more radical approach: I don't do that motion of going to every customer, trying to convince that customer that it's the right thing, and then hoping for the opt-in.
It's a quick switch in the UI where you default to on.
That makes it a little bit harder if you really don't want to do experimentation on the platform.
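The opt-out-by-default motion described here can be sketched as a tiny feature-flag layer. This is a minimal illustration, not anything from the speaker's actual product; all names (`DEFAULT_FLAGS`, `FeatureFlags`, the "experimentation" flag) are hypothetical:

```python
# Sketch of "default new releases to on, with a quick opt-out switch".
# All names here are hypothetical illustrations, not a real product API.

DEFAULT_FLAGS = {"experimentation": True}  # new releases default to on


class FeatureFlags:
    def __init__(self, overrides=None):
        # A customer's opt-outs are explicit overrides layered on the defaults,
        # so enabling the feature requires no action from the customer.
        self.flags = {**DEFAULT_FLAGS, **(overrides or {})}

    def is_enabled(self, name):
        return self.flags.get(name, False)


# A customer who never touched the switch gets the feature:
print(FeatureFlags().is_enabled("experimentation"))                            # True
# Opting out is a quick switch, not an opt-in campaign per customer:
print(FeatureFlags({"experimentation": False}).is_enabled("experimentation"))  # False
```

The design choice this encodes is the one argued above: the burden of action sits on the customer who does not want the feature, instead of on the vendor convincing every customer to opt in.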