Amy Edmondson
One way to think about this is: we will be failing. So let's do it joyfully, let's do it thoughtfully, and let's celebrate our failures appropriately.
In academia, we don't publish our null results. So that means not only do we not spend enough time on them to really learn what they're teaching us, but even more importantly, our colleagues near and far don't get to see them.
So then they're at risk of trying the same thing, which to me is the most wasteful of the wasteful failures is when we already had that knowledge, but somehow we aren't able to share it.
Yes. And, you know, it's not as strange as it sounds. You could still have very high standards because you wouldn't publish things that were just nonsensical or didn't have thoughtful hypotheses or theories that led you to spend that time studying them.
Let's do it the other way around.
I like the idea, though.
Number one, distinguishing different kinds of failure. A failure is not a failure is not a failure. You know, we could be talking about a little mistake. We could be talking about a catastrophic accident. We could be talking about a scientific hypothesis that didn't get supported. So providing the students that useful terminology and that useful clarity.
And then I think a second element that I'd love to see in the course is understanding experimentation best practices. You know, how do you think about good experiments versus not-good experiments?
Oh, no. No.
See, that's an error-driven, preventable, adverse drug event.
You know, there will always be things that go wrong, or at least not the way we wanted them to. And my observation in studying teams in a variety of industries and settings was that responses to failure were rather uniform. Inappropriately uniform.
The natural response and even the formal response was to find the culprit as if there was a culprit and either discipline or retrain or, you know, shame and blame the culprit. And it wasn't a very effective solution because the only way to prevent those kinds of system breakdowns is to be highly vigilant to how little things can line up and produce failures.
Right, I never had an A-. Well, you know, I once had one in 10th grade. It just was so devastating, I resolved not to have one again. And I'm only partly joking. But then she went to college. I got an F on my first semester multivariable calculus exam. An F. Like, I failed the exam. I mean, that's unheard of. What'd that feel like? I didn't see it coming, but I wasn't baffled after the fact.
After the fact, it was very clear to me that I hadn't studied enough.
Let's take two extremes. Let's say something goes wrong. We achieve an undesired result. On one end of the spectrum, it's sabotage. Someone literally tanked the process. They threw a wrench into the works. On the other end of the spectrum, we have a scientist or an engineer hypothesizing some new tweak that might solve a really important problem, and they try it. And it fails.
And, of course, we praise the scientist and we punish the saboteur. But the gradations in between often lull us into a false sense that it's blameworthy all the way.
My spectrum of causes of failures starts with sabotage or deviance. I soak a rag in lighter fluid, set it on fire, throw it into a building, right? Or I'm a physician in a hospital, I'm a surgeon, and I come to work drunk and do an operation.
That's right. There has to be intent here. To label something true sabotage, it has to be: my intent is to break something. It's not a mistake, and it's not a thoughtful experiment. There certainly are protocols in hospitals, for example, where thoughtful physicians will deliberately depart from the protocol because their clinical judgment suggests that would be better.
They may be right, they may be wrong, but that would not qualify as a blameworthy act.
Inattention is when something goes wrong because you just were mailing it in, you spaced out, you didn't hear what someone said and you didn't ask and then you just tried to wing it. Or you maybe are driving, you're a trucker and you're driving and you look away or fiddle with the radio and have a car crash.
Well, that's exactly right. Once we leave sabotage and move to the right in the spectrum, it will never be immediately obvious whether something's blameworthy or not. It's always going to need further analysis. So when we say the failure was caused by someone not paying attention, that just brings up more questions. Okay, why weren't they paying attention? Now, it could be that this
poor nurse was on a double shift, and that is not necessarily the nurse's fault. It might be the nurse manager who assigned that double shift, or it might be the fact that someone else didn't show up, and so they have to just do it, and they're quite literally too tired to pay attention fully, right?
So we always want to say, well, wait, let's see, what are the other contributing factors to this inattention?
Yes. One that comes to mind is a devastating collapse, with the loss of many lives, when walkways in a Hyatt Regency atrium collapsed in Kansas City in the early 80s. And the inattention there was the engineer of record's failure to pay close attention when the builder decided, out loud, not hidden, to swap one long steel rod for two smaller connected steel rods.
It would have been a five-minute calculation to show that won't work with the loads that were expected. It was a change that didn't get the attention it needed to avoid this catastrophic failure.
I think it was a combination of speed and money. Speed is money.
That's right. And that spans from a young child who doesn't yet know how to ride a bicycle, so as soon as they hop on that bicycle, they're going to fall off because they don't have the ability yet, to, you know, multivariable calculus, where, at least when you're not studying hard enough, you don't have the ability.
So it's something that you just don't have the ability to do successfully, but usually could develop.
That's a great connection. Yeah, the Peter Principle where the failure gets caused by the fact that you don't have the ability to do the new role, but no one really paused to reflect on that.
But I think it might be increasingly the case. There's no theoretical reason why the ability to be compelling and win people over to your point of view should be at odds with the capability to actually do the job. But the way it is increasingly set up in our society might be putting them at odds.
Yes. The task is too challenging for reliable failure-free performance. Example? A great example is an Olympic gymnast who is training all the time and is able to do some of the most challenging maneuvers, but will not do them 100% of the time. And so when that person experiences a failure, they trip during their routine, then we would call that a failure that was largely caused by the inherent challenge of the task. Can you give an example in either the corporate or maybe academic realm? Let's go to NASA, for example. The shuttle program is very, very challenging. I think we can all agree to that. And over time, they started to think of it as not challenging.
But really, it's a remarkably challenging thing to send a rocket into space and bring it back safely.
That's a good point. Actually, I love Richard Feynman looking back on the Challenger accident, his sort of simple willingness to just put the piece of O-ring in the ice water and see what happens, right? That's something that in a better-run, more psychologically safe, more generative work environment, someone else would have done in real time.
That's right. But that's I mean, that's not a good thing. That's not a good thing. You've got to learn from it so that it doesn't happen again.
So uncertainty is everywhere. There's probably, you know, an infinite number of examples here, but let me pick a silly one. A friend sets you up on a blind date and you like the friend and you think, okay, sure. And then you go out on the date and it's a terrible bore or worse, right? It's a failure. But you couldn't have known in advance. It was uncertain. How about a less silly example?
You're in a company setting. You have an idea for a strategic shift or a product that you could launch. And there's very good reasons to believe this could work, but it's not 100 percent.
I'm being fairly formal when I say experimentation, right? The most obvious example is a scientist in a lab who probably really believes it will work, and puts the chemicals in, and lo and behold, it fails. Or, on a much smaller scale: I'm going to experiment with being more assertive in my next meeting, and it doesn't quite work out the way I'd hoped.
It's the Edison quote, you know, 10,000 ways that didn't work. He's perfectly willing to share that because he's proud of each and every one of those 10,000 experiments.
I absolutely share that worry. And that case was, in my mind, a classic case of a complex failure. Yes, there was a human error. We also had faulty medication labeling and storing practices with alphabetical organization of drugs, which is not how you do it.
You know, you don't have a dangerous, potentially fatal drug next to one that's routinely used in a particular procedure. It's what we might call an accident waiting to happen. With that perspective in mind, RaDonda is as much a victim of a system failure as a perpetrator of the failure, right? So as for this reaction: human error is almost never criminal.
To criminalize this, I think, reflects an erroneous belief that by doing so, we'll preclude human error. No, what we will do is preclude speaking up about human error. And to her credit, she spoke up. And that, one could argue, ultimately led to her conviction. She would have been better off somehow trying to hide it, which I wouldn't advocate, obviously.
But when we recognize, deeply recognize, that errors will happen, then that means that what excellence looks like is catching and correcting errors, and then being forever on the lookout for vulnerabilities in our systems.
As an undergraduate, I studied engineering sciences and design.
Yeah, so I'm answering that question with a huge smile on my face. I worked three years for Buckminster Fuller, who was an octogenarian, a creative person, an inventor, a genius, a writer, a teacher, best known for the geodesic dome, which he invented, but single-mindedly focused on one question: how do we use design to make a better world? You can't sort of get people to change.
You have to change the environment, and then they'll change with it, was a kind of notion that he had. My part was just doing engineering drawings and building models and doing the mathematics behind new, simpler geodesic configurations, and it was so much fun.
Oh, yeah. He was a very enthusiastic proponent of using failure to learn. He said often: the only mistake we make is thinking we shouldn't make mistakes. He would give the example of the very first time he got a group of students together to build a geodesic dome. He'd done the math, he'd come up with this idea, and he got, you know, 20 students together. They're outside.
They built the thing, and it immediately collapsed.
And he enthusiastically said, okay, that didn't work. Now, what went wrong? And it was really the materials they were using, which were, I think, best described as Venetian blind material. They had the tensile strength, but they certainly didn't have the compressive strength to do their job.
Immediate diagnosis, right? We step back. OK, what do we set out to do? What actually happened? Why might that be the case? What do we do differently next time? I mean, that's a sort of a rough outline of an after action review. It could be flawed assumptions. It could be flawed calculations. It could be any number of things.
And we don't know until we put our heads together and try to figure it out.
I was interested in learning in organizations, and I got invited to be a member of a large team studying medication errors in hospitals. And the reason I said yes was, first of all, I was a first-year graduate student. I needed to do something. And second of all, I saw a very obvious link between mistakes and learning.
And so I thought, here we've got these really smart people who will be identifying mistakes, and then I can look at how people learn from them, and how easy is it, and how hard is it? So that's how I got in there. And then one thing led to another. After doing that study, people kept inviting me back.
That's right.
Now, you can divide adverse drug events into two categories, one which is related to some kind of human error or system breakdown, and the other which is a previously unknown allergy, so literally couldn't have been predicted. And those are still adverse drug events, but they're not called preventable adverse drug events.
On and on it goes. Or, you know, using language badly so that people didn't understand what you said and they didn't feel safe asking.
I want to be broad. Let's start broad. Like, a failure is something undesired that happens. And a failure-free life is not a possibility. One way to think about this is: we will be failing. So let's do it joyfully. Let's do it thoughtfully, and celebrate our failures appropriately.
Most, not all, but most professional failures have an element of the personal in them. It might be that we didn't put enough effort into it, or we missed signals that we probably should have been paying attention to, or we discounted someone else's perspective as less valid than our own. Most of the time, there is a personal or human, occasionally even character-based, contributor to the failure.
So it's hard to separate the professional and the personal.
When you refer to personal life, that is one of those domains where there's no right answer. So when I say I very likely made many decisions not to be at a Little League game where I could have been, you know, largely because of work demands that seemed too important to not focus on. And then, you know, what's the net result of that?
It may not be, you know, it may be some of it's bad, some of it's good. Maybe my sons felt that I didn't care. Maybe they didn't become professional baseball players, which is true. That is factually true.
Exactly. Statistically improbable anyway. And one thing I did leave them with, not by design but inadvertently, is a model of loving your work in a way that it's just engaging and you sort of can't stop thinking about it. So I don't feel too bad about that, but we might need to interview them to know for sure.
Some failures are objective. When the shuttle breaks apart upon reentry into the Earth's atmosphere, that is a failure, and there's no disagreement about it. But failures in the personal realm, or the work-life balance realm, are utterly subjective. We are societally very likely to see it differently based on gender, based on mother or father. And we know this, right?
Something that is seen as a success, you know, a successful or appropriate or positive behavior for a father can be coded very differently for a mother.
We don't have what you would call scientific evidence. I have plenty of anecdotal evidence in the classroom and also a theory. Okay, let's have it. So this is the unequal license to fail. And that can make, and I think does make, women more risk-averse, you know, in boardrooms and classrooms alike.
In my classroom, I have noticed over the years that women are substantially less likely to raise their hand with a mediocre comment. They put their own threshold higher. And I think of a classroom, and I try to convey this very clearly to my students, as a laboratory, right? As a place where we can make mistakes, so we don't make them out there.
The whole point of a classroom is to take risks, to get things wrong along the way to getting them right. Now, I understand it is a very social context and they want to be seen well in the eyes of others. But consistently, women act as if they're more risk-averse. They don't raise their hand.
And then they'll tell me that in my office, too, that they don't want to raise their hand unless they know it's a really good comment. And men seem to be less inhibited.
I just went from the blameworthy end all the way over to the praiseworthy end.
I meant to prepare a great deal more than I have. Uh-oh. I guess I've been preparing 30 years, so...
The natural tendency is just to look at what they call in hospitals the sharp end, the last person, the person at the bedside who administered that drug. But in fact, the chain of events goes back to the pharmacy and even to the IT folks who printed the label in a weird way.
Many times you have failures in organizations simply because one silo doesn't know what the other silo is doing. So these are learning events. One big reason we don't learn enough from failures is that we don't share them systematically enough.
I'm very happy to be called that. It seems like an upgrade. I became a scholar of failure because I wanted to be a scholar of organizational learning. So I came to graduate school with the idea, unformed, that organizations need to keep changing to stay relevant in a world that keeps changing. And they didn't seem to be very good at it.
I haven't met anyone who feels really good about failure, myself included. You have to force yourself to feel good about failure. And why do you think that is? I think it's our upbringing, right? By the time you're in elementary school, there's such a strong emphasis on getting the right answer or succeeding, not failing. And so...
We're not trained very well in the whole idea of uncertainty or novelty.
Sure. I think of them as emotional, cognitive, and social. So emotionally, we're just spontaneously averse to failure, right? I don't like it. I don't want to have it. I don't want to look at it, right? It's immediate. Cognitively, because we don't do a good job or don't have access to a simple framework to distinguish among kinds of failures, we then sort of decide to not like any of them.
And the fear part has to do with our concerns, very deep and deeply founded concerns of what other people think of us. So we don't want to be seen as having shortcomings. We don't want to be seen as associated with a failure.
We're good at failing. I mean, we are, by definition, fallible human beings, each and every one of us. And we will have failures. You know, the only real question is, how bad do we have to feel about it?
They're inadequate. When you say fail fast, fail often, big smile on your face, you know, most people, oh, yeah, I get it. I see innovation, blah, blah, blah. But at a deeper level, wait a minute. You know, failure is not good, right? I don't want a failure and I don't want to fail. So I'll pretend I agree with that. But in reality, no way. It's just wrong. Failure is bad.
But you know what I'm talking about. Slogans aren't enough. You know, slogans don't get you to the behavioral changes you need to make.
There's no objective criteria that are going to announce themselves to say go right, go left. So you're going to have to make a judgment.
If you're a child learning to ride a bicycle, please don't quit. If you're someone who thinks this particular paper is the best thing ever published and every single journal rejects it, there does come a point where it's probably worth quitting.
One way to think about this is we will be failing. So Let's do it joyfully. Let's do it thoughtfully and celebrate them appropriately.
In academia, we don't publish our null results. So that means not only do we not spend enough time on them to really learn what they're teaching us, but even more importantly, our colleagues near and far don't get to see them.
So then they're at risk of trying the same thing, which to me is the most wasteful of the wasteful failures is when we already had that knowledge, but somehow we aren't able to share it.
Yes. And, you know, it's not as strange as it sounds. You could still have very high standards because you wouldn't publish things that were just nonsensical or didn't have thoughtful hypotheses or theories that led you to spend that time studying them.
Let's do it the other way around.
I like the idea, though.
Number one, distinguishing different kinds of failure. A failure is not a failure is not a failure. You know, we could be talking about a little mistake. We could be talking about a catastrophic accident. We could be talking about a scientific hypothesis that didn't get supported. So providing the students that useful terminology and that useful clarity.
And then I think a second element that I'd love to see in the course is understanding experimentation best practices. You know, how do you think about good experiments versus not-so-good experiments?
Oh, no. No.
See, that's an error-driven, preventable, adverse drug event.
You know, there will always be things that go wrong or at least not the way we wanted them to. And my observation in studying teams in a variety of industries and settings was that responses to failure were rather uniform. Inappropriately uniform.
The natural response and even the formal response was to find the culprit as if there was a culprit and either discipline or retrain or, you know, shame and blame the culprit. And it wasn't a very effective solution because the only way to prevent those kinds of system breakdowns is to be highly vigilant to how little things can line up and produce failures.
Right, I never had an A-. Well, you know, I once had one in 10th grade. It just was so devastating, I resolved not to have one again. And I'm only partly joking. But then she went to college. I got an F on my first semester multivariable calculus exam. An F. Like, I failed the exam. I mean, that's unheard of. What'd that feel like? I didn't see it coming, but I wasn't baffled after the fact.
After the fact, it was very clear to me that I hadn't studied enough.
Let's take two extremes. Let's say something goes wrong. We achieve an undesired result. On one end of the spectrum, it's sabotage. Someone literally tanked the process. They threw a wrench into the works. On the other end of the spectrum, we have a scientist or an engineer hypothesizing some new tweak that might solve a really important problem, and they try it. And it fails.
And, of course, we praise the scientist and we punish the saboteur. But the gradations in between often lull us into a false sense that it's blameworthy all the way.
My spectrum of causes of failures starts with sabotage or deviance. I soak a rag in lighter fluid, set it on fire, throw it into a building, right? Or I'm a physician in a hospital, I'm a surgeon, and I come to work drunk and do an operation.
That's right. There has to be intent here. To label something true sabotage, it has to be: my intent is to break something. It's not a mistake, and it's not a thoughtful experiment. There certainly are protocols in hospitals, for example, where thoughtful physicians will deliberately depart from the protocol because their clinical judgment suggests that would be better.
They may be right, they may be wrong, but that would not qualify as a blameworthy act.
Inattention is when something goes wrong because you just were mailing it in, you spaced out, you didn't hear what someone said and you didn't ask and then you just tried to wing it. Or you maybe are driving, you're a trucker and you're driving and you look away or fiddle with the radio and have a car crash.
Well, that's exactly right. Once we leave sabotage and move to the right in the spectrum, it will never be immediately obvious whether something's blameworthy or not. It's always going to need further analysis. So when we say the failure was caused by someone not paying attention, that just brings up more questions. Okay, why weren't they paying attention? Now, it could be that this
poor nurse was on a double shift, and that is not necessarily the nurse's fault. It might be the nurse manager who assigned that double shift, or it might be the fact that someone else didn't show up, and so they have to just do it, and they're quite literally too tired to pay attention fully, right?
So we always want to say, well, wait, let's see, what are the other contributing factors to this inattention?
Yes. One that comes to mind is a devastating collapse with the loss of many lives, when walkways in a Hyatt Regency atrium collapsed in Kansas City in the early 80s. And the inattention there was the engineer of record's failure to pay close attention when the builder decided, out loud, not hidden, to swap one long beam for two smaller connected steel beams.
It would have been a five-minute calculation to show that won't work with the loads that were expected. It was a change that didn't get the attention it needed to avoid this catastrophic failure.
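That "five-minute calculation" can be sketched in a few lines. In the Hyatt Regency case, the as-built change meant the fourth-floor connection had to carry the second-floor walkway's load in addition to its own, roughly doubling the demand on that joint. The numbers below are illustrative units, not the real design loads; this is a back-of-envelope sketch, not the actual structural analysis.

```python
# Back-of-envelope statics for the Hyatt Regency walkway change.
# Original design: a single continuous hanger from the roof, so the
# fourth-floor connection carries only the fourth-floor walkway load.
# As-built: two shorter hangers, so the fourth-floor connection also
# carries the second-floor walkway load hung beneath it.
# (walkway_load is an illustrative unit, not the real design load.)

walkway_load = 1.0

original_joint_load = walkway_load       # fourth-floor walkway only
as_built_joint_load = 2 * walkway_load   # fourth-floor plus second-floor

print(original_joint_load)   # 1.0
print(as_built_joint_load)   # 2.0 (the change doubled the demand on the joint)
```

Even at this crude level, the doubled load on a connection designed for the original value is the kind of check that takes minutes, not days.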
I think it was a combination of speed and money. Speed is money.
That's right. And that spans from a young child who doesn't yet know how to ride a bicycle, so as soon as they hop on that bicycle, they're going to fall off because they don't have the ability yet. Or, you know, multivariable calculus, which, at least when you're not studying hard enough, you don't have the ability to do.
So it's something that you just don't have the ability to do successfully, but usually could develop.
That's a great connection. Yeah, the Peter Principle where the failure gets caused by the fact that you don't have the ability to do the new role, but no one really paused to reflect on that.
But I think it might be increasingly the case. There's no theoretical reason why the ability to be compelling and win people over to your point of view should be at odds with the capability to actually do the job. But the way it is increasingly set up in our society might be putting them at odds.
Yes. The task is too challenging for reliable failure-free performance. Example? A great example is an Olympic gymnast who is training all the time and is able to do some of the most challenging maneuvers, but will not do them 100% of the time. And so when that person experiences a failure, they trip during their routine, then we would call that a failure that was largely caused by the inherent challenge of the task. Can you give an example in either the corporate or maybe academic realm? Let's go to NASA, for example. The shuttle program is very, very challenging. I think we can all agree to that. And over time, they started to think of it as not challenging.
But really, it's a remarkably challenging thing to send a rocket into space and bring it back safely.
That's a good point. Actually, I love Richard Feynman looking back on the Challenger accident, his sort of simple willingness to just put the piece of O-ring in the ice water and see what happens, right? That's something that, in a better-run, more psychologically safe, more creative, generative work environment, someone else would have done in real time.
That's right. But that's I mean, that's not a good thing. That's not a good thing. You've got to learn from it so that it doesn't happen again.
So uncertainty is everywhere. There's probably, you know, an infinite number of examples here, but let me pick a silly one. A friend sets you up on a blind date and you like the friend and you think, okay, sure. And then you go out on the date and it's a terrible bore or worse, right? It's a failure. But you couldn't have known in advance. It was uncertain. How about a less silly example?
You're in a company setting. You have an idea for a strategic shift or a product that you could launch. And there's very good reasons to believe this could work, but it's not 100 percent.
I'm being fairly formal when I say experimentation, right? The most obvious example is a scientist in a lab who probably really believes it will work and puts the chemicals in, and lo and behold, it fails. Or, on a much smaller scale, I'm going to experiment with being more assertive in my next meeting, and it doesn't quite work out the way I'd hoped.
It's the Edison quote, you know, 10,000 ways that didn't work. He's perfectly willing to share that because he's proud of each and every one of those 10,000 experiments.
I absolutely share that worry. And that case was, in my mind, a classic case of a complex failure. Yes, there was a human error. We also had faulty medication labeling and storing practices with alphabetical organization of drugs, which is not how you do it.
You know, you don't have a dangerous, potentially fatal drug next to one that's routinely used in a particular procedure. It's what we might call an accident waiting to happen. With that perspective in mind, RaDonda is as much a victim of a system failure as a perpetrator of the failure, right? So, to this reaction: human error is almost never criminal.
To criminalize this, I think, reflects an erroneous belief that by doing so, we'll preclude human error. No, what we will do is preclude speaking up about human error. And to her credit, she spoke up. And that ultimately led to her conviction. One could argue she would have been better off somehow trying to hide it, which I wouldn't advocate, obviously.
But when we recognize, deeply recognize, that errors will happen, then that means that what excellence looks like is catching and correcting errors, and then being forever on the lookout for vulnerabilities in our systems.
As an undergraduate, I studied engineering sciences and design.
Yeah, so I'm answering that question with a huge smile on my face. I worked three years for Buckminster Fuller, who was an octogenarian creative person, an inventor, a genius, a writer, a teacher, best known for the geodesic dome, which he invented. And he thought single-mindedly about design: how do we use design to make a better world? You can't sort of get people to change.
You have to change the environment, and then they'll change with it, was a kind of notion that he had. My part was just doing engineering drawings and building models and doing the mathematics behind new, simpler geodesic configurations, and it was so much fun.
Oh, yeah. He was a very enthusiastic proponent of using failure to learn. He said often, the only mistake we make is thinking we shouldn't make mistakes. He would give the example of the very first time he got a group of students together to build a geodesic dome that he had, you know, he'd done the math, he'd come up with this idea, and he got, you know, 20 students together. They're outside.
They built the thing, and it immediately collapsed.
And he enthusiastically said, okay, that didn't work. Now, what went wrong? And it was really the materials they were using, which were, I think, best described as Venetian-blind material. They had the tensile strength, but they certainly didn't have the compressive strength to do the job.
Immediate diagnosis, right? We step back. OK, what do we set out to do? What actually happened? Why might that be the case? What do we do differently next time? I mean, that's a sort of a rough outline of an after action review. It could be flawed assumptions. It could be flawed calculations. It could be any number of things.
And we don't know until we put our heads together and try to figure it out.
I was interested in learning in organizations, and I got invited to be a member of a large team studying medication errors in hospitals. And the reason I said yes was, first of all, I was a first-year graduate student. I needed to do something. And second of all, I saw a very obvious link between mistakes and learning.
And so I thought, here we've got these really smart people who will be identifying mistakes And then I can look at how do people learn from them and how easy is it and how hard is it? So that's how I got in there. And then one thing led to another. After doing that study, people kept inviting me back.
That's right.
Now, you can divide adverse drug events into two categories, one which is related to some kind of human error or system breakdown, and the other which is a previously unknown allergy, so literally couldn't have been predicted. And those are still adverse drug events, but they're not called preventable adverse drug events.
On and on it goes. Or, you know, using language badly so that people didn't understand what you said and they didn't feel safe asking.
I want to be broad. Let's start broad. Like a failure is something undesired that happens. And a failure-free life is not a possibility. One way to think about this is we will be failing. So... Let's do it joyfully. Let's do it thoughtfully and celebrate them appropriately.
Most, not all, but most professional failures have an element of the personal in them. It might be that we didn't put enough effort into it or we missed signals that we probably should have been paying attention to or we discounted someone else's perspective as less valid than our own. Most of the time, there is a personal or human, occasionally character contributor to the failure.
So it's hard to separate the professional and the personal.
When you refer to personal life, that is one of those domains where there's no right answer. So when I say I very likely made many decisions not to be at a Little League game where I could have been there. You know, largely because of work demands that seemed, you know, seemed too important to not focus on. And then, you know, what's the net result of that?
It may not be, you know, it may be some of it's bad, some of it's good. Maybe my sons felt that I didn't care. Maybe they didn't become professional baseball players, which is true. That is factually true.
Exactly. Statistically improbable anyway. And one thing I did, not by design, but inadvertently leave them with is a model of loving your work in a way that it's just engaging and you sort of can't stop thinking about it. So I don't feel too bad about that, but we might need to interview them to know for sure.
Some failures are objective. When the shuttle implodes upon reentry into the Earth's atmosphere, that is a failure and there's no disagreement about it. But failures in the personal realm or work-life balance realm are utterly subjective. We are societally very likely to see it differently based on gender, based on mother or father. And we know this, right?
Think about it. You're in your job and suddenly you're getting an email from someone who's not your boss and not your peer, not your subordinate, not someone that you normally interact with to do your job. What's under it?