Nassim Taleb popularized the term “Black Swan” and has recently pointed out that COVID-19 can’t be categorized as such; he calls it a “White Swan” instead. Why? Because COVID-19 didn’t arrive suddenly and unpredictably: we knew enough about it to enact countermeasures, but many countries failed to do so. It wasn’t, in his words, “an unforeseen problem,” which is his definition of a Black Swan.
Looking back at Taleb’s early thinking on risk from 2009 is quite interesting, as grounded in an HBR piece he penned with collaborators titled “The Six Mistakes Executives Make in Risk Management.”
The mistakes are enumerated as:
- We think we can manage risk by predicting extreme events. His point is that we’re not able to predict Black Swans, so why try? He argues that we must instead think about the consequences of extreme events and constantly evaluate their potential impact.
- We are convinced that studying the past will help us manage risk. He says, “Risk managers mistakenly use hindsight as foresight,” and points out how much we love to use the word “unprecedented” in the context of crises. Real Black Swans have no precedents.
- We don’t listen to advice about what we shouldn’t do. For example, we dislike advice that forces us to take something away from our lives; we prefer “positive advice” instead because it’s easier to implement. He argues that we need to position risk-management activities as profit-generating ones, turning “negative advice” into a desirable kind of activity.
- We assume that risk can be measured by standard deviation. There’s no single, easy way to model an extreme event as one number, or to capture “outliers” with a convenient concept like standard deviation. Taleb argues that Black Swans can land 10, 20, or 30 standard deviations out, making the unit of measure irrelevant.
- We don’t appreciate that what’s mathematically equivalent isn’t psychologically so. A risk can be framed in one of two ways depending on the story told to a human mind. When a risk is framed around its desirable best-case scenario, we tend to welcome it; when it’s framed around its worst-case scenario, we quickly walk away. Remaining unbiased when approaching a situation of risk is key, and likely impossible.
- We are taught that efficiency and maximizing shareholder value don’t tolerate redundancy. Given the collapse of the global supply chain due to COVID-19, it’s evident that our system of product creation was optimized to the max, such that one tiny thing going wrong could take it completely down. We tend to optimize for efficiency at the expense of negative, unintended consequences that we’re never held accountable for, which is the root of the problem for corporations.
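The standard-deviation point above is easy to see in a small simulation. The sketch below (an illustration I’ve added, not from the HBR piece) compares a thin-tailed normal distribution with a fat-tailed Pareto distribution: in the normal case the largest of 100,000 draws sits only a few standard deviations from the mean, while in the fat-tailed case a single extreme draw can sit tens or hundreds of standard deviations out, which is exactly why the unit stops being meaningful.

```python
import random
import statistics

random.seed(42)

def max_in_sds(samples):
    """How many sample standard deviations the largest
    observation lies above the sample mean."""
    mu = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    return (max(samples) - mu) / sd

# Thin-tailed: 100,000 draws from a standard normal distribution.
normal = [random.gauss(0, 1) for _ in range(100_000)]

# Fat-tailed: 100,000 draws from a Pareto distribution with
# shape alpha = 1.5, whose theoretical variance is infinite.
pareto = [random.paretovariate(1.5) for _ in range(100_000)]

print(f"normal max: {max_in_sds(normal):.1f} SDs above the mean")
print(f"pareto max: {max_in_sds(pareto):.1f} SDs above the mean")
```

The normal maximum lands around four to five standard deviations out no matter how often you rerun it; the Pareto maximum is wildly unstable from run to run, because the standard deviation itself is dominated by whatever extreme value happened to appear in the sample.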