George Bonaci
Podcast Appearances
And then once we were no longer under the gun, we went back and optimized for rigor to understand what worked and what actually didn't.
Having said that, it would be great if you could figure out what some leading indicators are, or scope down the experiment so you can get some signal like, hey, we have confidence this will or won't work. In general, I've always tried to prioritize experiments based on impact and effort. I think those are the obvious ones, but also confidence and time to results.
Like if you're really confident something's going to work, you should just do it. If you're really unconfident that something's going to work, but the time to results is really fast, you should do that as well. And I think usually those two dimensions are lost when people are prioritizing. They tend to just go for impact and effort and not think about confidence or time to results.
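One way to make those four dimensions concrete is a simple scoring sketch. The formula and the 1-to-5 scales below are illustrative assumptions, not a model he describes: impact and confidence push a score up, effort and time to signal push it down.

```python
from dataclasses import dataclass

@dataclass
class Experiment:
    name: str
    impact: float           # expected payoff if it works, 1 (low) to 5 (high)
    effort: float           # cost to run, 1 (cheap) to 5 (expensive)
    confidence: float       # prior belief it will work, 0.0 to 1.0
    weeks_to_signal: float  # time until you can read the result

def priority_score(e: Experiment) -> float:
    """Illustrative score: reward impact and confidence, penalize
    effort and slow feedback. The weighting is a made-up example."""
    return (e.impact * e.confidence) / (e.effort * e.weeks_to_signal)

experiments = [
    Experiment("SEO content refresh", impact=4, effort=3, confidence=0.8, weeks_to_signal=8),
    Experiment("Homepage headline test", impact=2, effort=1, confidence=0.3, weeks_to_signal=1),
    Experiment("New paid channel", impact=5, effort=4, confidence=0.4, weeks_to_signal=6),
]

for e in sorted(experiments, key=priority_score, reverse=True):
    print(f"{e.name}: {priority_score(e):.2f}")
```

With these made-up numbers, the cheap, fast, low-confidence headline test outranks the slower bets, which is exactly the "time to results is really fast" case above.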
That's where I think the experimental design and the rigor of experiments come in. You'll have a hypothesis. And I think most marketers tend to jump to, let's go do something. Let's go launch something. But the experimental design means understanding what you're actually able to measure, what you will measure, how long it'll take to get that result, and whether it's statistically significant or not.
All of that, honestly, you either need to be able to do that math yourself, or go find someone on a data team and work with them, or acknowledge the fact that, hey, we don't actually have a good way to think about this or measure this. And that's OK. We're going to go collect some qualitative data. And it means we might be wrong.
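For the math itself, here is a minimal sketch of the two calculations that usually matter: whether an observed difference is statistically significant, and how much traffic you need (and therefore how long you'll wait) to detect a given lift. It uses a standard two-proportion z-test and power approximation in plain Python; all the conversion numbers are hypothetical.

```python
import math

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the gap between two conversion rates
    (pooled two-proportion z-test, normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def required_n_per_arm(p1: float, p2: float,
                       z_alpha: float = 1.96,      # two-sided alpha = 0.05
                       z_beta: float = 0.84) -> int:  # 80% power
    """Approximate visitors needed per arm to detect a lift from p1 to p2."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Hypothetical result: control converts 4.5%, variant 5.5%, 1,000 visitors each.
p = two_proportion_p_value(conv_a=45, n_a=1000, conv_b=55, n_b=1000)
print(f"p-value at 1,000/arm: {p:.3f}")                       # ~0.30, not significant
print(f"needed per arm: {required_n_per_arm(0.045, 0.055)}")  # ~7,500
```

The same numbers answer the "how long it'll take" question: at roughly 1,000 visitors per arm per week, detecting a 4.5% to 5.5% lift takes about 7,500 visitors per arm, so seven to eight weeks before you can honestly read the result.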
Folks need to be honest about that upfront and define it upfront. That's probably how I would think about it.
Yeah, absolutely. You should like triple down on that. I think that another mistake most startups make is they see something that works and they're like, okay, great. Let's increase our spend. Let's double it. Let's triple it. Ideally, if it's working, you take that channel to saturation as quickly as possible.