Challenger
Writings about uncertainty, complexity, history, leadership, and more!
Word Count: ~800
Read Time: 3 minutes
In my second semester at business school, I took a class on organizational leadership. For one session, we broke into teams and were given a fictional scenario called “The Carter Racing Case.” In the scenario, we played the crew chief of the Carter Racing team. Our team had been plagued by a series of engine failures that we had a hard time explaining. One character in the case thought the cold weather had contributed to the failures, but another member (who was much more senior and offered much stronger evidence) claimed it wasn’t the cold weather at all but a separate technical issue, which he had since fixed. Now it was the morning of the biggest race of the year. The team had worked hard all year to get to this point, the morning was very cold, and we faced a choice: to race, or not to race.
If we decided to race, there was a very good chance that, as long as we finished, we would turn a profit for the team. If we chose not to race, there would be no profit for the year, and we would start from square one the next season. If, however, we raced and the engine failed, we would lose our biggest sponsor; there would be no more team, and no more next year.
Out of 14 teams in the class, 12 decided that putting the car in the race was the best decision. Those 12 teams applied all of the tools that we had learned in business school. They evaluated all of the available evidence, they made a decision tree with all the probabilities, they calculated the expected monetary value considering all possible scenarios, they steeled themselves against the bias of “loss aversion,” and they decided that the rational thing to do was to put the car on the track and race, despite the risk. Only two teams (including mine) decided to take the loss and not race. After all of the teams had made their decisions known, along with their very sound reasoning, the professor asked the two teams that had chosen not to race if we would like to change our position.
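To see how that kind of analysis plays out, here is a minimal sketch of the expected-monetary-value calculation those teams would have run. Every dollar figure and probability below is a hypothetical stand-in I chose for illustration, not the actual numbers from the Carter Racing case:

```python
# Hypothetical decision-tree numbers -- NOT the real case figures.
P_ENGINE_FAILURE = 0.3        # assumed chance the engine fails if we race
PROFIT_IF_FINISH = 50_000     # assumed season profit if we race and finish
LOSS_IF_FAILURE = -200_000    # assumed cost of losing the sponsor (ruin)
PAYOFF_IF_SIT_OUT = 0         # no profit, but the team survives

def expected_value(p_fail, win, lose):
    """Expected monetary value of racing: a probability-weighted average."""
    return (1 - p_fail) * win + p_fail * lose

ev_race = expected_value(P_ENGINE_FAILURE, PROFIT_IF_FINISH, LOSS_IF_FAILURE)
print(f"EMV of racing:      ${ev_race:,.0f}")
print(f"EMV of sitting out: ${PAYOFF_IF_SIT_OUT:,.0f}")
```

With these made-up numbers the average favors sitting out, but nudge the failure probability down a little and the math flips in favor of racing, even though one branch of the tree ends the team for good. That is exactly the trap the case sets.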
I looked at the professor and said “Not a chance. Staying off the track is a no-brainer.”
She asked the two teams, “Why are you choosing not to race when most of your classmates have made a clear case that racing is the rational choice?”
I looked at her and said, “Four words, professor: Never Bet The Farm.” The other team agreed: racing wasn’t worth the risk.
The professor gave us a big smile and told everyone to break up from their team huddles and go back to their seats. She turned to us and said, “This case was based on a real-life scenario that occurred in 1986: the Space Shuttle Challenger disaster.”
As it turned out, choosing not to race was the correct choice, even though the case is designed specifically to make people want to race. Its details mirror the situation that NASA’s decision-makers faced on the morning of January 28th, 1986. Their decision to launch, despite warnings from some engineers that the cold weather could cause a catastrophic malfunction, resulted in the greatest disaster in the history of the US space program.
After the class, I approached the professor and asked if she was aware of something called “The Green Lumber Fallacy,” one of my favorite ideas from Nassim Taleb’s book Antifragile. The fallacy occurs when someone mistakes unimportant information for the information that matters most, while disregarding what is actually important.
In this instance, the other MBA candidates in my class had a whole bunch of genuinely useful information. They aren’t stupid people (they’re all far smarter than me), and they flawlessly applied exactly what they had been taught. The problem is that they applied their tools in a context that didn’t call for them. All you had to know was that if the engine failed (as it had several times before), there was no next year; it was game over. If you simply take the loss and identify the actual cause of the engine failure, you can race next year and the year after. No matter the context, you have to avoid the risk of ruin.
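The risk-of-ruin point can be made concrete with a tiny simulation. Again, the numbers are purely illustrative assumptions of mine: suppose each season the team races, there is a 10% chance the engine fails and the team is gone forever. A positive average payoff per season is cold comfort when one bad draw ends the game:

```python
import random

def seasons_survived(p_ruin=0.1, n_seasons=20, seed=None):
    """Race every season until ruin; return how many seasons the team lasted."""
    rng = random.Random(seed)
    for season in range(n_seasons):
        if rng.random() < p_ruin:
            return season  # engine failure: no more team, no more next year
    return n_seasons

# Chance of surviving all 20 seasons is 0.9 ** 20 -- only about 12%.
print(f"Chance of lasting 20 seasons: {0.9 ** 20:.0%}")
```

A strategy that is "rational" on any single play can still be near-certain ruin when you have to keep playing, which is why taking the one-year loss dominates.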
In summary, never assume that information, or the tools you have to interpret it, will let you make perfect predictions about the future. Especially at large scale, complexity and uncertainty will always reign supreme, and you must always make allowances for them. No matter how certain you think you are, Never. Bet. The. Farm.
Links to Check Out:
I’d love to hear from you! Just reply to this email and I’ll get back to you.
If you know of anyone who might like this newsletter, please forward it to them. If this email was forwarded to you, please use this to subscribe.
From My Bookshelf
📚 Buy Things Fall Apart by Chinua Achebe
From My AirPods
🎧 A friend shared this YouTube channel with me, and this guy is incredible.
Best cover of Wayfaring Stranger I have ever heard. Check it out!