Two wrongs may not make a right, but what if your wrongs could become right? Harvard professor Amy Edmondson's new book, Right Kind of Wrong: The Science of Failing Well, explores the benefits of failure and the conditions required to learn and grow from mistakes. Peppered with examples large and small, from the Challenger disaster to her parents' nearly-missed blind date, Edmondson shows us that we are surrounded by failure. Her book offers frameworks to understand human error and concrete advice to turn it into an advantage, building on (and clarifying) her famous work on psychological safety.
Psychological safety is a tough sell. Not because it isn't a critical ingredient for healthy and innovative organizations. Employees who feel psychologically safe at work are 76% more engaged, 57% more collaborative and 67% more likely to try new skills on the job. But psychological safety is broadly misunderstood and much maligned. It's not a free-for-all where people can say whatever they want without consequence. It's not an insurance policy against being fired. It’s not an excuse to be offended by everything and everyone. Sadly, the term psychological safety is often met with an eye roll or a cringe these days—just when we need its outcomes most: after all, what organization doesn't benefit when people can openly call out problems and missteps to prevent disaster? When people feel safe enough to try new things and innovate? Enter Edmondson's new book.
There's a well-known mantra that successful companies “fail fast and often.” This glib saying masks a few sobering realities. First, failure triggers our fight-or-flight response, so we are hardwired to avoid it. Even young children are quick to avoid blame in the face of a mishap. Second, learning from failure contradicts human nature: extensive research shows that we learn from other people's failures, but when it comes to ourselves, we glean insights only from our successes. Despite these obstacles, Edmondson believes we can reshape our understanding, remove the stigma and reframe our relationship to failure. We just can't do it alone. We need workplace cultures that celebrate learning and promote healthy relationships to make failure safe. Here are five important takeaways from Edmondson's book.
- Fail Intelligently
The rightest wrong is an intelligent failure. Intelligent failures are the result of informed forays into new spaces, like hypothesis-driven research for new drugs or new technologies. They yield important lessons, deepening insights and informing future and better experiments. Innovative companies, Edmondson suggests, should proactively architect intelligent failure by designing pilot projects with four key attributes: they explore new territory, they pursue a meaningful opportunity, they are informed by existing knowledge, and they keep the bets small enough that failure is affordable.
To encourage intelligent failure, celebrate the lessons that emerge from each attempt. Thomas Edison made countless tries to develop a storage battery, calling each failure a valuable insight into what would not work. Chef René Redzepi, of the Michelin three-starred restaurant Noma, carved out special time for junior chefs to try new recipes, welcoming failure as a critical ingredient for an innovative culinary experience. Eli Lilly was known for its failure parties in the 1990s to encourage scientists to acknowledge failure quickly and free up resources.
- Minimize Mistakes
Sadly, not all failures are intelligent. The lion's share of failure looks a lot like basic errors, or a network of mistakes that leads to a complex mess. Sometimes non-intelligent mistakes give us incredible inventions. The removable glue on a Post-It note was a failed attempt at a super-strong adhesive. Penicillin was discovered when a scientist forgot to wash his petri dishes before he went on vacation. Viagra was originally intended to treat hypertension.
Despite these rare eureka moments, most failures are basic flops. You ignore the weird noise in the car, you push off maintenance on the boiler because you are busy, you rely on intuition rather than data. Complex failures, Edmondson says, are compounded basic failures, with subtle warning signs and uncontrollable elements thrown into the mix: many small errors interact to create a huge disaster. Think of the Torrey Canyon tanker, which ran aground in the 1960s and caused the biggest oil spill in UK history, or the Boeing 737 MAX crashes that cost hundreds of lives and grounded the fleet. At the core of these tragedies lie the very human drivers of our most basic errors: inattention, neglect, overconfidence, faulty assumptions. And often, disproportionate financial incentives to succeed, at all costs.
- Shift From Blame To Praise
Reducing failure requires reframing its impact to reduce its stigma. It starts with how we treat mistakes. To highlight the hazy line between blameworthy and praiseworthy failure, Edmondson routinely asks leaders to estimate the percentage of blameworthy mistakes in their organizations. As they consider the causes of most missteps (on a spectrum from intentional sabotage to misaligned skills to outsized uncertainty to failed experimentation), they usually arrive at a small number: only 1-2% are truly worthy of blame. But when she asks how many of these failures are treated as blameworthy, the number balloons to 70-90%. Even complex errors have multiple causes that make finding fault challenging or irrelevant. A blame-and-shame culture is a failure in its own right. It impedes learning and encourages deceit: people go to great lengths to hide failures that could cost them their jobs. Leaders at all levels must explicitly adopt and model a culture of blameless reporting, not only to permit early detection of potentially dangerous errors, but to destigmatize failure.
- Learn From Failure As A Team
One of Edmondson's most powerful insights is that “social systems prevent complex failures.” Humans may be the problem (it turns out we're not infallible), but we are also the solution. We can't fully eradicate failure, but the most effective teams have the skills they need to reduce danger and learn from it. They listen to each other's concerns with a genuine desire to learn, and ask questions designed to surface quieter expert perspectives. They challenge their assumptions to avoid confirmation bias and expand understanding. They disagree with respect and without retribution, making it safe to speak up with bad news. And they act with empathy, putting themselves in others' shoes. Edmondson's signature story stems from her own failed early-career hypothesis that high-performing healthcare teams make and report fewer mistakes. She discovered the opposite: high-functioning teams report more mistakes, because they've created the conditions that make identifying them more accepted. And they grow faster from the lessons they learn together.
- Practice, Practice, Practice
Building the human systems to prevent, and learn from, error takes time. But most importantly it takes practice. Mindsets and behaviors never change overnight. For Edmondson, practicing means finding ways to normalize failure. She points to the Andon Cord, a long-established practice in the Toyota Production System. Workers who identify a problem are required to pull a cord that stops the line. Colleagues and leaders first thank them for identifying the problem and then support them with the resources they need to fix it and restart the line. She highlights Rapid Response Teams in hospitals, designed to support healthcare emergencies and remove the pressure on nurses to make judgment calls alone. Dry runs and drills help colleagues befriend error and vulnerability, prioritize safety, and catch issues before they become problems.
Failures happen. At the end of the day, it's not processes, protocols or rules that will prevent them. Managing failure is about empowering people. While it's human nature to screw up, you can create the conditions to boost accountability, learn from your errors, and avoid the sting of repeated failure. Whether you like the term or not, psychological safety is at the core of changing your mindset—it requires the genuine belief that you can try, fail or blow the whistle, and the people around you will have your back. This is the only thing that mitigates the fear of and aversion to mistakes that come pre-loaded in our human software. Building this kind of trust won't stop failure. But it will keep failure from stopping us.
First published on Forbes.com.