
Henry Petroski: The value of failure

Failure is important because of the information it reveals and because it combats the human tendency to grow overconfident, says an engineering professor and author.

Photograph by W.E. Fretwell
In the floods of 1926, the Fremantle Bridge in Perth, Western Australia, collapsed just after a train had passed over it.

May 17, 2011

The biggest misperception people have about failure is that it is all bad, said Henry Petroski, a professor of engineering and history at Duke University who researches the role that failure plays in design. “But from an engineer’s point of view,” he said, “a failure can contain all sorts of helpful information.”

It reveals weaknesses, helps make things stronger and offers lessons in humility, he said.

Petroski is the author of 15 books, including “To Engineer Is Human: The Role of Failure in Successful Design” and “Success Through Failure: The Paradox of Design.” He has also written histories of the design of the pencil, the toothpick and the bookshelf.

“Failures that happened 2,000 years ago can still be instructive today,” he said. “Not that we’re trying to do the same things the ancients were doing, but we’re using the same intellectual tools those people were.”

Petroski spoke with Faith & Leadership about what he’s learned about failure from his research, how it influences his teaching, and why a bridge collapses about every 30 years. The following is an edited transcript.

Q: Why are you more drawn to stories of failure than to stories of success? What’s the value of failure?

Success stories don’t teach us anything but that they are successes. They are things to emulate, but the word “emulate” means two things. One, it means effectively to copy. Nobody wants to copy. Everybody wants to be more creative. They want to do something better. So “emulate” also implies trying to go beyond -- trying to make it better, somehow bigger, whatever the measure is.

Successes are not very interesting other than in that regard. When we do go beyond, then we move generally closer to failure. And what interests me about any failure is that it presents real lessons to be learned, because there’s no ambiguity. When something fails, it failed.

Generally, failure does several things. One, it shows us when something is not working as we had planned. That’s one definition of failure. You design something, you expect it to behave or perform a certain way. If it doesn’t, then there are lessons to be learned from that.

The earthquake in Japan is full of lessons. Nobody in Japan -- the engineers, the planners -- wanted things to happen the way they did, obviously. The earthquake and the subsequent tsunami were so much larger than previous ones that the Fukushima nuclear plant was overwhelmed. Certain things that were supposed to back up and provide safety didn’t, and as a result, the accident was worse than it might have been.

But there are lessons that people can take from that -- not only the Japanese but anybody else who is designing nuclear plants in any other country and even beyond the nuclear plants. You can generalize from these things.

For example, the nuclear plant had been operating OK for years, and so you could say it was a success up until it was a failure. While it’s a success, it doesn’t teach much. All you can really logically conclude from it is that it’s a success. It’s doing the right thing. But then when something extraordinary happens and something overwhelms it, it reveals its weaknesses. That’s the value of failure.

Q: How do you, as a teacher, get your students to see the value of failure?

I use a lot of case studies and stories. Generally, I tend to use more historical ones, because those stories are fixed and static.

I talk about things that are happening now in class, too, but always with the caveat that “This is what we know as of today. Tomorrow there may be something new that comes out.” If we, again, go to what’s happening in Japan right now, every day there’s a different story in the news, and it gives us a different perspective. It’s sort of a moving target.

It’s still a valid teaching tool, but I like to use historical case studies, because they are more fixed in time. I also like to use them because they have valuable lessons for the present and the future.

I give my students readings that go back to Vitruvius’ “De Architectura,” which is a book that was written in the first century [B.C.] about Roman and Greek architecture that talks about failures back then. The lessons in there are as clear as if we’re talking about something that is in the newspaper today.

I also read with them from the work of Galileo from the 17th century. He opens up one of his books talking about a whole bunch of failures. He basically uses them to say, “Well, we Renaissance engineers don’t quite have it all figured out yet. What are we missing?” And it’s that attitude -- that context of failure -- that helps the students get started on figuring it out.

I like to see students come out of my courses with a collection of these horror stories or these failure stories, so that if they’re engineering students and they’re designing something someday, they’ll say, “You know, what we’re doing reminds me of what they did to that thing that failed back in the 19th century,” because, it turns out, there are very similar ways in which things fail.

The same human mistakes tend to be repeated over and over and over.

Q: What are those?

They tend to be lapses in logic, thinking that you’ve covered all the bases when you in fact haven’t.

Overconfidence is a really big one, and overconfidence tends to come in a climate of success.

When there are no bridges collapsing, when they’re all working and everybody is happy and driving across bridges without a care in the world, engineers, being human, begin to think maybe we’ve finally got it figured out. So they may become a little careless or they may say we don’t have to inspect these bridges as frequently as we do, because we’ve finally got it figured out. And sure enough, sooner rather than later, something is going to go wrong. And it does. The historical record is undeniable.

But then the converse of that is when something fails, when something goes wrong, that puts everybody on alert. Everybody then says, “Oh, well, we obviously haven’t got it figured out. We’d better be more careful with our thinking or reasoning or logic when we’re designing something. We’d better check and recheck our calculations, our models, and so forth and so on.”

We’ve been doing things wrong since before recorded history. That, I think, is a strong argument that it’s the way we think or the way we don’t think and the flaws in our logic and maybe even in our attention span. We get careless now and then.

Whenever a big failure happens, there’s usually an investigation, especially if lives are lost. Committees and commissions -- they write big reports, and those big reports contain a lot of stuff, but if you slog through them, you usually find these same things.

Q: You mentioned the historical record, and in your books you write about a 30-year pattern with bridge collapses. Describe that.

People have studied bridge building for a century and a half or more, and it’s been well-documented. They have noticed that there is a major bridge failure about every 30 years.

The question is, why 30 years?

One of the explanations is that this is about the duration of a professional generation. An engineer’s career is about 30 years long, roughly. What happens is that these young engineers are coming in and the older engineers are, at the same time, moving out.

The older engineers have all this wisdom and experience. But many organizations don’t have a formal procedure for taking that knowledge that’s in the older generation and imparting it to the newer generation. Or even if they do, the younger generation is sort of cocky and thinks they know more. After all, they’ve just gone through school and they’ve learned the latest stuff. So even if there is an attempt to pass on the wisdom to the younger generation, the younger generation either rejects it or doesn’t take it very seriously.

Then, depending on where it comes in the cycle, if the younger generation doesn’t experience failures directly due to its own miscalculations, it gets cocky. It gets comfortable, overconfident and complacent, and all of those qualities welcome mistakes and lead to failure.

The 30 years comes from actual data. If you look at the bridge failures, you can tick them off if you want to. But getting a really predictive model about how it operates hasn’t been successfully done yet.

The literary scholar Franco Moretti also found that styles of the novel change about every 30 years.

Q: Knowing about this 30-year pattern of bridge failure and generational replacement, how does this influence how you teach?

One thing I try to do is make the students aware of it so that they are not just swept up in it without thinking. If they go and begin to work for a company and the boss says, “We’re going to assign you a mentor who has been doing this stuff for 25 years, and he’s got a lot to teach you,” they’re not just going to blow it off. I hope they aren’t.

I don’t know how seriously they take me, but I’m expecting that maybe they’ll take it seriously. I guess that depends on how convincing I am.

Q: What should leaders keep in mind about failure?

Some of the brightest minds that have ever lived -- people that we recognize as geniuses -- have made mistakes. I don’t think anybody denies that Galileo was a genius, a brilliant person. He made mistakes. I’m not talking about personal mistakes; I’m talking about mistakes in his engineering and science.

He made the wrong assumptions in some cases. People followed the products of his reasoning for almost a century even though the result was wrong. The process wasn’t wrong, but the result was wrong because he assumed things that were not true. So he started from a false premise.

I think that’s very humbling. If an undisputed genius can make a mistake and publish it, and it can escape eyes and scrutiny for almost a century, then we’re pretty presumptuous to think that’s never going to happen to us.

Q: What can be learned from engineering about how to handle mistakes? When should failure be acknowledged publicly and when shouldn’t it?

In a lot of cases engineers have no control over whether it should be made public or not, because it’s already out in the open. If a bridge fails, you can’t cover it up. It’s there.

But not all engineering mistakes are made public. One of the expectations is that the engineer will catch his or her own mistakes while things are still in the planning stage. An engineer will usually be working in teams, in part because the projects are so large that no one person can do them. But there’s always the expectation that your work conforms to a convention that another engineer can follow and catch errors or catch failures.

An engineer, for example, might be trying to design a bridge strong enough to carry 50,000 pounds. He might go on and do all the calculations with 50,000 pounds, but if he showed them to somebody, that person might say, “Don’t you think some trucks going across that bridge are going to be heavier than 50,000 pounds?” Well, that’s the kind of mistake you want to catch before you start building the thing.

We all experience this kind of error and correction. Those kinds of errors and failures are seldom made public. There’s no need for them to be made public. They’re part of the process of getting it right. It’s a self-correcting process.

But one of the reasons for making failures public, and a lot of engineers over history have advocated talking about failures, is because why should you keep these lessons from your colleagues who might be in another country or across the world or even in a different generation?

Very distinguished engineers, especially during the Victorian era, wrote rather eloquently about how important they thought it was that failures, as well as triumphs, be reported. Lessons in humility, perhaps.