There’s an overused quote that goes something like, “Smart people learn from their mistakes, but the smartest people learn from other people’s mistakes.”
Like most overused quotes, this one gets thrown around without a lot of thought. It’s not wrong, but it is incomplete and it can cause you to miss a deeper, more valuable truth.
This essay is a brief meditation on errors, and I hope it will leave you with a new perspective.
Chess computers are fascinating. Do you know how they work? When a chess computer is trying to figure out which move to make, it simulates millions of games and then plays the move that most often leads to a win. This requires the computer to simulate losing a great many games, because it has to rule out the moves that lead to defeat.
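To make the idea concrete, here is a minimal sketch of that playout approach. It uses a toy Nim game instead of chess (the game, function names, and parameters are my illustration, not taken from any real chess engine), but the principle is the same: simulate many random games after each candidate move, count the losses as well as the wins, and pick the move with the best record.

```python
import random

def legal_moves(pile):
    # Toy Nim: you may take 1-3 stones; taking the last stone wins.
    return range(1, min(3, pile) + 1)

def random_playout(pile, my_turn):
    # Finish the game with uniformly random moves for both sides;
    # return True if the original player ends up winning.
    while pile > 0:
        pile -= random.choice(list(legal_moves(pile)))
        if pile == 0:
            return my_turn  # whoever just moved took the last stone
        my_turn = not my_turn
    return not my_turn

def best_move(pile, playouts=2000):
    # Estimate each candidate move's win rate by simulation -- which
    # means deliberately simulating thousands of losing games in order
    # to rule the bad moves out.
    best, best_rate = None, -1.0
    for move in legal_moves(pile):
        wins = sum(
            random_playout(pile - move, my_turn=False)
            for _ in range(playouts)
        )
        rate = wins / playouts
        if rate > best_rate:
            best, best_rate = move, rate
    return best
```

From a pile of 5 stones, the simulation converges on taking 1 stone, because leaving the opponent with 4 yields the highest simulated win rate. The engine only discovers this by "losing" in the many playouts where it leaves 3 or 2.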
Unlike computers, humans can’t always simulate all possible moves and then choose the one with the highest probability of long-term success. But what humans can do is learn to embrace errors the way the chess computer embraces errors.
In my essay Failceeding, I talked about how, as a company commander, I allowed people to fail for the sole purpose of getting them to succeed more than they would if they never failed. I didn’t necessarily want them to avoid errors by learning from others’ mistakes. I wanted them to make errors and learn from their own mistakes.
The quote is incomplete because it sets error avoidance as the goal, rather than error correction. Not only is avoiding all errors impossible, but it is also a foolhardy goal. In fact, a zero-defect culture, like that of many units in the Army, is counterproductive to progress and performance in the real world.
In his book Risk Savvy, Gerd Gigerenzer contrasts positive error cultures with negative error cultures. Positive error cultures are ones in which errors are embraced and studied with an eye toward improvement. Negative error cultures are ones in which errors are stigmatized, covered up, ignored, and forgotten as quickly as possible. The interesting thing about these two kinds of cultures is how each comes into contact with reality. Positive error cultures tend to arise where problems are obvious and cannot be ignored, because reality is undeniable.
On a personal level, cooking is an activity that facilitates positive errors. If you cook something, you’ll know right away if you made errors. Something can be too salty, overcooked or undercooked, or simply poorly conceived; one ingredient can overpower the rest. You cannot hide the mistakes from yourself or from others. If you want to improve, you have to figure out what you did wrong, which requires admitting that you did, in fact, do something wrong. If you have a particularly weak psychological makeup you may try to blame external factors, but ultimately it is you who are to blame.
Going back to chess, the best chess players don’t go back and look at their brilliant moves, they study their errors. They want to revisit exactly why they made a critical error, and what led up to it. The same is true in combat sports like Jiu Jitsu. These types of activities are in direct contact with reality and the participants get instant feedback about their performance, and that feedback cannot be easily manipulated.
But in areas with little to no contact with reality, or that heavily shield themselves from reality, you get negative error cultures with groupthink and nonsense. Nassim Taleb has a great quote that says, “it is easier to macro-bullsh*t than micro-bullsh*t.” A barber can either cut your hair the way you want or she can’t. She can’t BS you for very long. A geo-political strategist, on the other hand, can dazzle you with grand theories about what is going to happen to the world, and when he is wrong, he can explain very easily why he was wrong and why you should listen to him next time. I mean, people still listen to Paul Ehrlich for goodness sake!
Intelligence, especially on an organizational scale, is not about eliminating errors but about adjusting from them. But in order to adjust, you have to actually have errors, and the errors have to come from the reality itself, not from a constructed or simulated reality. Intelligent organizations deal with reality on reality’s terms.
This idea about contact with reality is extremely important when we are talking about military operations.
When militaries do not have direct contact with the reality of war, their skills start to atrophy. If there is no environmental feedback to force units to face their mistakes and make adjustments, then militaries lose all of their organizational intelligence. We try to simulate environmental feedback through wargaming simulations, but these are often so plagued with problems that users are more likely to blame the game (or their higher headquarters) for the problems than to look at themselves. Dead icons on a computer screen are easily explained away—dead Soldiers on a battlefield are much harder.
The Global War on Terror was, in many ways, a disaster for the military because it led to two decades of platoon-level combat. The military, as an institution, was shielded from the realities of real war. Most civilians assume the military must be sharp because it fought a “war” for 20 years. This is incorrect. We were involved in a conflict where the vast majority of the people never experienced anything close to real combat. It was only soldiers at the tactical level who tasted what real war is about. The military lost its ability to fight at the division level and above.
To explain what I mean by real war (what we in the military call Large Scale Combat Operations, or “LSCO,” pronounced “Liss-Co”), let me put it this way: we lost about 7,000 US servicemen and women over 20 years of conflict. It’s hard to know the exact figures of Russian or Ukrainian soldiers killed in the last year, but it’s well over 100,000 per side. That’s more than 10x the number killed in 5% of the time. *When* the next big war goes down, we could lose 7,000 soldiers in a week. That is the reality of war.
Fortunately, the Army is starting to prepare for this type of conflict. Our doctrine is shifting, our training is shifting, and our mentality is shifting. We are starting to realize that we don’t really know how to evacuate and treat 2,000 casualties from a single battle. We are starting to realize that we have critical gaps in our capabilities, and we are thinking about how we can address them. But what some units are missing is the determination to actually get serious about fighting wars.
A friend of mine was recently talking about how, in his unit, the senior officers are all getting ready to retire, and as a result, they don’t take the unit’s training seriously. When problems are identified, which is rare, the blame gets passed around and dissipated to the point where no one takes responsibility for fixing things. People don’t like bringing up problems because they are afraid they’ll have to do work to fix them. And who wants to do that much work when they are so close to retirement? This is a negative error culture: there, errors are bad things to be buried, not lessons to be learned.
I feel lucky to be in the unit that I am in because we take our profession seriously. When we find problems, we fix them. When there is friction or confusion in the plan, we figure it out and adapt. When we make mistakes, we own them and fix them. I’d go to war with this unit any day.
I hope that other units and organizations wake up and start taking their profession seriously. You can’t hide from reality forever, and when reality comes knocking, you better be ready to answer.