NOTE: The magazine, Mechanical Engineering, refused to publish this.
Risk Management
By paid-up S&A subscriber Jay Lipeles
To the Editor:
I read with alarm the cover story in your current issue, "Risk-Informed Decision Making". The basic message, that risk can be managed, is wrong and extremely dangerous. Mechanical Engineering does a great disservice to its readership and the engineering community at large by promoting such nonsense.
To cite a few examples: In the investigation that followed the Challenger disaster, NASA testified as to the risk assessment it had made prior to the event. The assessment was wrong. Seven astronauts died, and it cost the nation millions of dollars and several years to recover. To my mind, NASA never has recovered; witness your article. Bad thinking remains. The error in NASA's assessment (and in related risk management theories) is that inherent in the theory (and usually unstated, as in your article) is the assumption that the situation is ergodic. NASA's assessment was based on (among other things) test data taken at room temperature. The probability of failure of the O-rings was estimated from that data, and it was incorrect. The situation was non-ergodic. The theory was inapplicable. The analysis was wrong, and disaster followed.
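The O-ring point can be made concrete with a toy simulation. The failure model, the temperature threshold, and every number below are hypothetical, chosen only to illustrate the general mechanism: a probability estimated from data taken in one regime says nothing about behavior in another.

```python
import random

random.seed(0)

def oring_fails(temp_f, p_warm=0.02, p_cold=0.5, threshold=40):
    """Hypothetical failure model (illustrative numbers only):
    failure probability jumps below a temperature threshold."""
    p = p_cold if temp_f < threshold else p_warm
    return random.random() < p

# "Test data": many trials, all run at room temperature.
warm_trials = [oring_fails(70) for _ in range(10_000)]
estimated_p = sum(warm_trials) / len(warm_trials)

# Launch day is far colder; the warm-weather estimate does not transfer.
cold_trials = [oring_fails(31) for _ in range(10_000)]
actual_p = sum(cold_trials) / len(cold_trials)

print(f"failure rate estimated from warm tests: {estimated_p:.3f}")
print(f"failure rate in the cold:               {actual_p:.3f}")
```

The warm-weather sample is internally consistent and statistically well behaved, yet the estimate it yields is off by an order of magnitude in the regime that mattered.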
Long Term Capital Management (LTCM) lost $4.6 billion in 1998 and subsequently failed. Among the reasons for its failure was that it assessed its risk with an analysis similar to the one in your article, which assumed that the situation was ergodic. It was not. When Russia defaulted on its debt, the mistake was revealed and LTCM went down.
Now we are struggling through the worst recession since the Great Depression. It was initiated by the collapse of the subprime market. It goes without saying that there were a number of contributing causes. Among them was the bundling of mortgages into securities backed by risk analyses that predicted the risk to be very low. The analyses assumed that the situation was ergodic. Whoops! In all of history, was there ever a more costly mistake?
Inherent in all analyses of this type is the assumption that the situation is ergodic. It almost never is. What the authors would have us believe is that a careful, thorough risk analysis can be helpful in reducing risk. Nonsense! What such an analysis does is provide a sense that risk is being addressed when it is not. It contributes to a false sense of security.
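The ergodic assumption, stated plainly, is that an average taken across many parallel cases equals the average a single case experiences over time. A standard toy counterexample (sketched below with illustrative numbers, not drawn from any of the cases above) is a repeated multiplicative bet: the ensemble average grows every round, yet any single player's wealth shrinks over time.

```python
import math
import random

random.seed(1)

UP, DOWN = 1.5, 0.6   # +50% on heads, -40% on tails, fair coin

# Ensemble average: the expected one-round multiplier across many
# players is 1.05, so the bet looks profitable "on average".
ensemble_mean = 0.5 * UP + 0.5 * DOWN

# Time average: one player's wealth over many rounds, tracked in
# log space to avoid numerical underflow.
rounds = 10_000
log_wealth = 0.0
for _ in range(rounds):
    log_wealth += math.log(UP if random.random() < 0.5 else DOWN)

growth_per_round = math.exp(log_wealth / rounds)

print(f"ensemble mean per round:      {ensemble_mean:.3f}")
print(f"one player's realized growth: {growth_per_round:.3f}")
```

The realized per-round growth comes out near sqrt(1.5 x 0.6) = 0.949, below 1: the typical player is ruined even though the cross-sectional average rises without bound. An analysis that averages over the ensemble when the decision-maker lives along a single time path, as a firm or an agency does, is answering the wrong question.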
The most important quality an engineer brings to the game is his judgment. Greater reliance on risk management really implies (and demands) reduced reliance on judgment (engineering, financial, or whatever). It is a very bad strategy, based as it is on an assumption known to be wrong. It is an extremely dangerous strategy. We should instead be relying more heavily on, and developing, the people who possess good judgment.
(investment newsletter Ferris comment: Thank you, Jay, for this intelligently crafted letter, which I believe is right on the money. Judgment is what investors need, not fancy math models that contribute, as you point out, "to a false sense of security." Well done.)