Jan - 19 - 2018

A Simple Monte Carlo Model for Publicly Traded Equity Valuation


This is a simple Monte Carlo model for valuing the equity of publicly traded companies. Now let's be clear: valuation is never "simple." A valuation that seems simple and obvious to one person may seem naive and silly to another. In the end it comes down to the model that you apply and to the assumptions that you feed it. Do not ever think that a model of the future predicts the future. As the continuing financial meltdown demonstrates, the past is not a particularly reliable indicator of the future, and people do not behave according to a standard normal distribution. However, with careful analysis of the past, and clear thinking about the present and future, it is possible to get an idea of a range for the valuation (Black/Gray Swans and Cluster Flocks notwithstanding). That is where Monte Carlo modeling comes into play.

What is Monte Carlo Simulation?

Let's start with a typical financial forecast model. Someone sits down and creates a set of linked financial statements - usually just the income statements and balance sheets. The most common method of constructing these pro forma statements is the "percent of sales" method. This usually involves calculating each of the income statement amounts as a percent of that year's sales (excepting certain items, like interest payments and dividends, which are not intrinsically driven by sales results). From there, a sales forecast is created based on some kind of growth rate, and the model produces a "result." Full valuation models will also include a horizon value predicting the amount that the stock will be worth in a few (usually five) years. The horizon model is usually a simple dividend discount model such as the Gordon growth model. From there an often arbitrary discount rate is applied to the cash flows to get a present value of the equity. Subtract the debt and divide by shares outstanding (or fully diluted) and you get a value for a share of the company's stock. A careful model may use the previous several years of financial statements (usually three or five) to calculate the percent of sales for each line item on the income statement, assume a historical inflation rate for the horizon value growth rate, and start making decisions. This is a financial modeling hack. There are better ways.
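The percent-of-sales forecast with a Gordon growth horizon value can be sketched in a few lines. This is a minimal illustration with entirely hypothetical figures (no real company), using operating profit after tax as a crude cash-flow proxy rather than a full set of linked statements.

```python
# Percent-of-sales forecast with a Gordon growth horizon value.
# All inputs below are hypothetical placeholders.

sales = 1000.0            # last year's sales
growth = 0.05             # assumed sales growth rate
cogs_pct = 0.60           # COGS as a percent of sales (historical average)
opex_pct = 0.25           # operating expenses as a percent of sales
tax_rate = 0.30
discount_rate = 0.10      # often chosen somewhat arbitrarily
horizon_growth = 0.03     # long-run growth rate for the Gordon model
years = 5

pv_cash_flows = 0.0
for t in range(1, years + 1):
    sales *= (1 + growth)
    ebit = sales * (1 - cogs_pct - opex_pct)
    cash_flow = ebit * (1 - tax_rate)        # crude free-cash-flow proxy
    pv_cash_flows += cash_flow / (1 + discount_rate) ** t

# Gordon growth horizon value: CF_(n+1) / (r - g), discounted back to today.
horizon_value = cash_flow * (1 + horizon_growth) / (discount_rate - horizon_growth)
pv_horizon = horizon_value / (1 + discount_rate) ** years

enterprise_value = pv_cash_flows + pv_horizon
debt = 500.0
shares = 100.0
value_per_share = (enterprise_value - debt) / shares
print(round(value_per_share, 2))
```

Note how much of the answer comes from the horizon value: with these inputs it dominates the present value of the explicit forecast years, which is exactly why the horizon growth rate and discount rate deserve so much scrutiny.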

First of all, you can be nearly 100% confident that every assumption in the model is incorrect. Will next year's sales be exactly as forecast - to the penny? Of course not. Will the cost of goods sold or the gross margin be exactly the average of the last five years? Of course not. Even last year did not match the mean of the previous five years. This leads to two questions:

  1. To what factors and assumptions is the model sensitive?
  2. How do these factors and sensitivities interact?

The first question should be answered in every financial model. As a professor of mine (Dr. Vinod Venkiteshwaran) put it so eloquently, "If you are not doing sensitivity analysis then you are not doing financial analysis." Anyone with Excel experience should be familiar with data tables and the techniques of sensitivity analysis. You build a data table varying every one of your assumptions and monitor how the valuation (or other figure of merit) changes, then sort from most to least change. This provides a list of the things - in order of importance - that management (or the financial analyst) should monitor and examine very carefully.
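The data-table procedure described above can be sketched as a one-at-a-time sensitivity pass: shock each assumption by the same percentage, record the change in the figure of merit, and sort. The valuation function and assumption names below are hypothetical stand-ins for a real linked-statement model.

```python
# One-at-a-time sensitivity analysis, mirroring what an Excel data table does.
# The valuation function is a toy stand-in for a full financial model.

base = {"sales_growth": 0.05, "cogs_pct": 0.60, "discount_rate": 0.10}

def value_per_share(a):
    # Stand-in for a full linked-statement valuation model.
    margin = 1 - a["cogs_pct"]
    return 100 * a["sales_growth"] * margin / a["discount_rate"]

base_value = value_per_share(base)

sensitivities = []
for name in base:
    bumped = dict(base)
    bumped[name] *= 1.05                      # shock each input by +5%
    change = value_per_share(bumped) - base_value
    sensitivities.append((name, change))

# Sort from largest to smallest absolute impact on the valuation.
sensitivities.sort(key=lambda x: abs(x[1]), reverse=True)
for name, change in sensitivities:
    print(f"{name:15s} {change:+.3f}")
```

The sorted output is exactly the watch list described above: the assumptions at the top are the ones management or the analyst should monitor most carefully.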

The second question is a lot more difficult to answer. If you have 50 assumptions (a relatively small number for a serious financial model) then you have 50 x 50 = 2,500 potential interactions. If one assumption changes by 5%, how does that affect the sensitivity of the other assumptions? And those are only the first-order effects. What about second- and third-order effects? Does a change in one cause another to increase the sensitivity of several others? At this point we have a functionally infinite set of potential interactions. There is no way to do a point-by-point analysis of this many interactions. This is where Monte Carlo modeling comes in.

According to Wikipedia (accessed 7/24/2012):

Monte Carlo methods (or Monte Carlo experiments) are a class of computational algorithms that rely on repeated random sampling to compute their results. Monte Carlo methods are often used in computer simulations of physical and mathematical systems. These methods are most suited to calculation by a computer and tend to be used when it is infeasible to compute an exact result with a deterministic algorithm.[1] This method is also used to complement theoretical derivations.

Monte Carlo methods are especially useful for simulating systems with many coupled degrees of freedom, such as fluids, disordered materials, strongly coupled solids, and cellular structures (see cellular Potts model). They are used to model phenomena with significant uncertainty in inputs, such as the calculation of risk in business. They are widely used in mathematics, for example to evaluate multidimensional definite integrals with complicated boundary conditions. When Monte Carlo simulations have been applied in space exploration and oil exploration, their predictions of failures, cost overruns and schedule overruns are routinely better than human intuition or alternative "soft" methods.[2]

The term "Monte Carlo method" was coined in the 1940s by John von Neumann, Stanislaw Ulam and Nicholas Metropolis, while they were working on nuclear weapon projects (Manhattan Project) at the Los Alamos National Laboratory. It was named after the Monte Carlo Casino, a famous casino where Ulam's uncle often gambled away his money.


Monte Carlo methods are used in finance and mathematical finance to value and analyze (complex) instruments, portfolios and investments by simulating the various sources of uncertainty affecting their value, and then determining their average value over the range of resultant outcomes. This is usually done by help of stochastic asset models. The advantage of Monte Carlo methods over other techniques increases as the dimensions (sources of uncertainty) of the problem increase.

Monte Carlo methods were first introduced to finance in 1964 by David B. Hertz through his Harvard Business Review article, discussing their application in Corporate Finance. In 1977, Phelim Boyle pioneered the use of simulation in derivative valuation in his seminal Journal of Financial Economics paper.

Now, suppose that for each of the assumptions in the model we were to jump to the unjustified conclusion that the correct value is normally distributed around the mean which we calculated from the last few years of financial statements. It is a big jump, but since we do not have functioning crystal balls we are going to have to add another largely unjustified assumption to the model. Now we can randomize all of the assumptions within a range around their means and see how that affects the model. Of course a single randomization is not likely to hit the correct value, because the randomization will jitter hither and yon. But if we randomize several thousand times we can begin to see where the data generally lands. We can compute a mean and construct a confidence interval around it. (Since we cannot know the population standard deviation and will be using a point estimate based on the sample, we will use the t-distribution to establish the confidence interval.) We can produce a range that we expect the valuation to land within. That is Monte Carlo modeling.
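The core loop just described - draw each assumption from a normal distribution around its historical mean, run the model thousands of times, and build a t-based confidence interval on the results - can be sketched as follows. The means, standard deviations, and the toy valuation formula are all hypothetical; a real model would run the full linked statements on each draw.

```python
# Monte Carlo loop: normal draws for each assumption, repeated runs,
# and a t-distribution confidence interval on the valuation.
# All distribution parameters here are hypothetical.

import math
import random
import statistics

random.seed(42)
N = 2000

values = []
for _ in range(N):
    growth = random.gauss(0.05, 0.02)        # sales growth ~ N(mean, sd)
    margin = random.gauss(0.40, 0.03)        # gross margin ~ N(mean, sd)
    discount = random.gauss(0.10, 0.01)      # discount rate ~ N(mean, sd)
    values.append(100 * growth * margin / discount)   # toy valuation model

mean = statistics.mean(values)
sd = statistics.stdev(values)                # sample standard deviation

# 95% CI using the t-distribution; with 1,999 degrees of freedom the
# critical value is approximately 1.961 (very close to the normal 1.96).
t_crit = 1.961
half_width = t_crit * sd / math.sqrt(N)
print(f"mean {mean:.2f}, 95% CI [{mean - half_width:.2f}, {mean + half_width:.2f}]")
```

With 2,000 draws the confidence interval on the mean is narrow, but the spread of the individual outcomes (the sample standard deviation) is what tells you the range the valuation could plausibly land in.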

There are many programs, such as Oracle's Crystal Ball or Structured Data's RiskAMP, that provide much more complex and flexible Monte Carlo modeling functions, including selection among various distribution types, interlinked ranges, and automatic summarization of results. If you are actually going to make a financial decision based on the results of a simulation then you should use one of these products instead of the spreadsheet available here. Which brings up the next topic...


This is an experimental spreadsheet model that you can download from some guy on the internet. I developed it as an experiment and am posting it for educational purposes. I do not recommend that you make financial decisions using this model, as it is not provably correct, not conceptually grounded, and probably not even safe for children to look at. If you do choose to use it for some significant purpose you are doing so at your own risk. I do not endorse it, condone it, recommend it, or even think it is a good idea. If you use it and make a lot of money I would appreciate some of the cash. If you lose money that's your problem and I will not have any part of it.

While the spreadsheet is stable, this article documenting its operation and use is a work in progress so check back from time to time to see how things are going.

The Excel model is located here: MC_Single.xlsx


The basic operation is that you do your work on the Valuation tab. You enter your assumptions in the green cells under the Assumptions heading on the left, and see the simulation results under the Valuation heading on the right. The simulation results are also displayed as a histogram below the value per share range. Of course, how you develop and choose the values that are entered into the assumption cells is a very complex question. So is the actual interpretation of the results, which are displayed in the section labeled "Confidence of Value per Share."

The Financials tab forecasts the underlying financial statements and H-model horizon value. The values in the financial statements are influenced by the values in the Casino tab, which are randomly distributed according to the means and standard deviations entered in the assumptions section. The actual results of the financials and horizon value for each of the 2,000 cycles are recorded in the Simulation tab and summarized back in the Valuation tab.
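The H-model horizon value used on the Financials tab is a two-stage dividend discount model in which growth declines linearly from a short-run rate to a long-run rate over a period of 2H years. A minimal sketch of the standard formula, with hypothetical inputs:

```python
# H-model: V0 = D0(1+gL)/(r-gL) + D0*H*(gS-gL)/(r-gL)
# where gS is the short-run growth rate, gL the long-run rate,
# r the discount rate, and H the half-life of the high-growth period.
# The inputs below are hypothetical.

def h_model_value(d0, r, g_l, g_s, half_life):
    return (d0 * (1 + g_l) + d0 * half_life * (g_s - g_l)) / (r - g_l)

v = h_model_value(d0=2.00, r=0.10, g_l=0.03, g_s=0.08, half_life=5)
print(round(v, 2))
```

The first term is the plain Gordon growth value at the long-run rate; the second term adds a premium for the fading high-growth period, which is why the H-model is a gentler assumption than holding the high growth rate constant for the whole horizon.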

Operating Design


How Does it Work?


A Tour of the Tabs


Where Do I Get The Data?


What Does It All Mean?


All Original Content Copyright 2014 Jon Bennett. All Rights Reserved.
