Management: The intelligent errors we make in planning and forecasting

Those who have read Kahneman’s “Thinking, Fast and Slow” will find Michael Lewis making a great impression on the same subject with his latest book, “The Undoing Project”.

I read the book in three hours flat and had to read it once more. Barring the first chapter, the rest was a page-turner.

I must admit that I have always been a fan of the duo of Amos Tversky and Danny Kahneman, who helped the new field of economics, “Behavioral Economics”, come to fruition. Their Prospect Theory and the cognitive biases they catalogued stand out as fundamental theories in the field of decision making under uncertainty.

This essay is not a book review; rather, it draws its inspiration from the ensemble of books and papers that connect this duo to the world of human errors in judgment.

Management science has so far taken only a small hint of this, as much of what happens in management is treated as definitive rather than probabilistic. The stakeholders around us ask for certain outputs, not chance forecasts, no matter how complex the task at hand.

No CEO would ever dare to say, “There is a 90% chance the product will succeed but a 5% chance that it will fail under special conditions”, which could well be the closest to the truth. How the market would interpret such a statement is anybody’s guess.

Nor would any CFO dare to say that the debt-to-EBITDA ratio could be range-bound between 3:1 and 4:1 under extremely volatile commodity prices. Management is about finding ways to reach certainty of outputs by acting on the imponderables.

It is interesting how human judgment can be skewed by new information; the intelligent mind makes very systematic errors in dealing with it.

As new information steps in, the Bayesian world of conditional probability is the way to go, but the intelligent mind is completely at odds with it. If a product launch has shown good results for a few quarters, that information should serve as a base for assessing potential results in the future. There is a Bayesian way that gives the right weight to this information, and there is an “intelligent” way that falls for the small-sample bias.
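
To make the contrast concrete, here is a minimal sketch in Python of the Bayesian way, using a Beta prior over a product’s quarterly success rate. The prior and the quarterly outcomes are hypothetical, chosen purely to show how the update weighs a small sample.

```python
# A minimal sketch of the Bayesian way: updating a belief about a
# product's quarterly success rate as new quarters come in.
# The prior and the quarterly outcomes are hypothetical numbers.

# Beta(a, b) prior over the probability that a quarter beats target;
# Beta(1, 1) is the flat "no opinion" prior.
a, b = 1.0, 1.0

# Three observed quarters: True = beat target, False = missed.
quarters = [True, True, False]

for beat in quarters:
    if beat:
        a += 1  # one more success
    else:
        b += 1  # one more failure

posterior_mean = a / (a + b)
print(f"Posterior mean success rate: {posterior_mean:.2f}")
# Two good quarters out of three move the estimate to 0.60, not to
# certainty -- the small sample gets only the weight it deserves.
```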

If new information is the basis for a stock to do well today, negation of that information should prompt the stock not to do so well the day after. However, the intelligent mind can assign a lower probability to the more probable event, and vice versa.

If, under repeated draws from a bag of balls, red comes out predominantly over white in the first two draws, this small-sample distribution cannot be assumed to resemble the actual population. Prior and posterior beliefs cannot, in the end, be reconciled the way we want without factoring in the Bayesian conditional probabilities.
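
A quick simulation illustrates how misleading two draws can be; the 50/50 bag and the sample sizes below are my own illustrative choices.

```python
# Even when the bag is exactly half red and half white, very short
# runs of draws frequently look lopsided.
import random

random.seed(42)
TRIALS = 100_000

def lopsided(n_draws: int) -> float:
    """Fraction of trials where all n draws from a fair bag are red."""
    hits = 0
    for _ in range(TRIALS):
        if all(random.random() < 0.5 for _ in range(n_draws)):
            hits += 1
    return hits / TRIALS

for n in (2, 5, 10):
    print(f"P(all {n} draws red from a 50/50 bag) ~ {lopsided(n):.3f}")
# With 2 draws, an all-red sample shows up about 25% of the time --
# small samples routinely misrepresent the population they come from.
```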

When assessing a future scenario, no matter how complex it is and no matter how many factors or variables the conclusion is drawn from, the expectation is a definitive final forecast, be it prices, market size, or growth. You will rarely see probabilities, estimation errors, confidence intervals, or hypothesis tests in management forecasts or guidance.
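
For contrast, here is a minimal sketch of what guidance stated with a confidence interval could look like; the quarterly growth figures are invented for illustration.

```python
# A forecast as an interval rather than a single number.
# The quarterly growth figures are made up for illustration.
import math
import statistics

growth = [4.2, 5.1, 3.8, 6.0, 4.5, 5.3, 4.9, 3.6]  # % per quarter

n = len(growth)
mean = statistics.mean(growth)
sem = statistics.stdev(growth) / math.sqrt(n)

# 95% interval using the normal approximation (a t-multiplier would
# be stricter for n = 8, but the point is the interval, not its width).
lo, hi = mean - 1.96 * sem, mean + 1.96 * sem
print(f"Growth guidance: {mean:.1f}%, 95% CI [{lo:.1f}%, {hi:.1f}%]")
```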

Statistics makes management look weak, and vice versa; leadership would rather be about achieving definitive things, no matter how shaky the foundations are.

Am I exaggerating? Well, no; that is exactly how a budget must be drawn up or a plan put in place. This is where we must look at some of the cognitive biases the duo taught us about.

Most forecasts are based on heuristics that can be swayed by powerful influences. The biggest influence is the latest information, which acts as if it were certain guidance for forecasting events far into the future.

Think of the price forecast you are trying to make for a commodity such as crude oil, cotton, or copper: if an influential event happens today, the long-term price forecast will be disproportionately swayed by it. Factoring in a proportionate change stemming from current information is what the Bayesian approach is all about. It allows us to adjust correctly for how much current events should influence the estimate of the future. As more mutually contradicting pieces of information step in, it becomes even more difficult to put the right weight on each of them.
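
Here is a sketch of such a proportionate adjustment, done as a simple normal-normal Bayesian update of a long-term crude oil price belief; every number in it is hypothetical.

```python
# Giving today's news a proportionate weight: a normal-normal Bayesian
# update of a long-term price belief after a noisy new signal.
# All numbers here are hypothetical.

prior_mean, prior_var = 80.0, 10.0**2   # long-term belief: $80 +/- $10
signal, signal_var = 95.0, 20.0**2      # today's noisy reading: $95 +/- $20

# The posterior weights each source by its precision (1/variance),
# so a noisy headline moves the forecast only part of the way.
w = prior_var / (prior_var + signal_var)
post_mean = prior_mean + w * (signal - prior_mean)
post_var = prior_var * signal_var / (prior_var + signal_var)

print(f"Updated forecast: ${post_mean:.1f} +/- ${post_var**0.5:.1f}")
# The $15 jump in the signal moves the forecast by only $3 -- the
# Bayesian weight, not the recency heuristic, decides how far to move.
```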

The intelligent mind’s bias lies in the weights we give to this new information.

As the human mind is risk-averse where a gain can be locked in but willing to take risks where a loss is the certainty, the weighting factors get apportioned disproportionately. Assessing how a prototype will do in the future needs such weights to be put in place, as does the performance of a product in a new market, or the launch of something completely new in unknown territory.
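
This asymmetry can be written down. The sketch below uses the value and probability-weighting functions from Tversky and Kahneman’s 1992 cumulative prospect theory with their published median parameters; for simplicity it applies the gain-side weighting parameter to both sides, and the gamble itself is illustrative.

```python
# Prospect Theory's asymmetry, with the Tversky & Kahneman (1992)
# median parameters (alpha = 0.88, lambda = 2.25, gamma = 0.61).
# For simplicity the gain-side weighting parameter is used for losses
# too; the gamble amounts are illustrative.

ALPHA, LAMBDA, GAMMA = 0.88, 2.25, 0.61

def value(x: float) -> float:
    """Gains are felt less strongly than equal-sized losses."""
    return x**ALPHA if x >= 0 else -LAMBDA * (-x)**ALPHA

def weight(p: float) -> float:
    """Small probabilities are overweighted, large ones underweighted."""
    return p**GAMMA / (p**GAMMA + (1 - p)**GAMMA) ** (1 / GAMMA)

# A 50/50 gamble: win $100 or lose $100 -- zero expected value,
# but decidedly negative in prospect-theory terms.
prospect = weight(0.5) * value(100) + weight(0.5) * value(-100)
print(f"Felt value of a fair $100 coin flip: {prospect:.1f}")
# Losses loom about 2.25x larger than gains, so the flip feels bad.
```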

All these are risk events, and their success depends on how management uses new information.

Management of uncertainty is what management is all about, but rarely does statistics play a role. Before pursuing a strategy that could sway the fortunes of a company, the least we could do is test hypotheses and estimate outcomes with confidence intervals. Basic statistics is simple, and the core concept of probability is about estimating a posterior event, or the population, from new information or sampling. Sampling replaces the tedious job of dealing with large data sets and is actually cost-efficient.
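
As a sketch of how little machinery this takes, here is a one-sided test of whether a hypothetical pilot’s conversion rate genuinely beats a baseline, or whether the lift is just sampling noise.

```python
# Did the pilot region's conversion rate genuinely beat the 10%
# baseline, or is the lift just sampling noise? The pilot figures
# are hypothetical.
import math

baseline = 0.10
conversions, trials = 68, 500          # pilot: 13.6% conversion

p_hat = conversions / trials
# One-sided z-test for a proportion against the baseline.
se = math.sqrt(baseline * (1 - baseline) / trials)
z = (p_hat - baseline) / se
# Normal tail probability via the complementary error function.
p_value = 0.5 * math.erfc(z / math.sqrt(2))

print(f"Observed rate {p_hat:.1%}, z = {z:.2f}, one-sided p = {p_value:.4f}")
# A p-value well below 0.05 says the lift is unlikely to be noise --
# a sample of 500 did the work of waiting for the full market data.
```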

Knowing the systematic errors that the intelligent mind can make, it is all the more important that we do not get swayed by noise and instead search for the signal. The world of trading could draw immense lessons from the work of this duo.

I remain one of the most ardent admirers of Amos and Danny; their work should lead us to believe that no matter how intelligent we are, we are prone to making very systematic errors in judgment.

