I get a lot of emails asking what program I use to generate my charts, so I thought it would be worth discussing this and a few other process-related questions in a separate post. The short answer is that I use LibreOffice Calc, a free, open-source alternative to Microsoft Excel. Its capabilities far exceed my needs, and it's free. It is compatible with .xls and .xlsx files, and the developers have done an excellent job making it behave the way Excel behaves, so the learning curve is gentle and there are no barriers to interacting with people who use Excel. Did I mention that it's free? Other than Excel's pretty ribbon interface, I have not yet come across a feature that I miss.
The next question that often comes up is how I get fundamental corporate data into my model. This is an easy one: I input it manually. There are subscription services that do this automatically, but they are quite pricey and aimed at investors with significantly larger portfolios than mine. There is also a free solution using the Stock Market Functions Add-in (Excel only, unfortunately, and no Macs either!), which grabs data from free online sources like Yahoo, Google and Morningstar (among several others). I toyed around with this last year, as the opportunity to automate the process appealed to me greatly, but my experience was negative overall. Because the data are sourced from various websites, they suffer from the same limitations as those websites: line items that exist in the source material (the 10-Ks and 10-Qs) are lumped into broad categories to simplify and standardize the presentation. This removes an important level of information, in that many things get stuck under "Other." That is fine if you only want a general sense of how the company is doing, but I am wary of discarding important data, especially this early in the process.
The short story is that unless you can afford to pay for top-of-the-line services that fully reproduce the financial statements digitally (Bloomberg terminals do this, and it appears that Fetch XL does too), the gold standard is manual input. I do this for every company I research. I usually go back as far as the SEC's data goes (c. 1994), inputting all of the annual data for the three main financial statements, and then I look at quarterly data, usually two or three years back unless there is a compelling reason to go further. There is also always industry- and company-specific information that I grab, like Average Selling Price, Stores Open, Comparable Store Sales, etc. I cannot stress enough the importance of this information, especially for detecting possible financial shenanigans (read Financial Shenanigans or my in-depth review to learn more). Now that I think of it, this is another reason to grab the information manually: I don't think any of the automated solutions capture it, and you simply cannot afford to ignore it.
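To make one such shenanigans check concrete, here is a minimal sketch (in Python rather than spreadsheet formulas, and with purely hypothetical numbers, not figures from any real filing) of a classic red-flag test: receivables growing persistently faster than revenue, which can signal aggressive revenue recognition.

```python
# Sketch: flag fiscal years where receivables growth outpaces revenue
# growth by a wide margin -- a classic warning sign discussed in
# Financial Shenanigans. All figures below are hypothetical.

def receivables_red_flags(revenue, receivables, threshold=0.10):
    """Return years where receivables growth exceeds revenue growth
    by more than `threshold` (10 percentage points by default)."""
    flags = []
    years = sorted(revenue)
    for prev, curr in zip(years, years[1:]):
        rev_growth = revenue[curr] / revenue[prev] - 1
        rec_growth = receivables[curr] / receivables[prev] - 1
        if rec_growth - rev_growth > threshold:
            flags.append(curr)
    return flags

revenue     = {2019: 100.0, 2020: 110.0, 2021: 118.0}
receivables = {2019: 20.0,  2020: 26.0,  2021: 27.0}
print(receivables_red_flags(revenue, receivables))  # -> [2020]
```

In 2020 receivables grew 30% against 10% revenue growth, so that year gets flagged; the threshold is an arbitrary illustration, not a rule from the book.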
I’ve been asked several times how I investigate companies, so I’ll give a brief overview here. Once I have manually input all of the fundamental data into my spreadsheet, I check the graphs. Since I’ve standardized the way I input the data, my spreadsheet automatically calculates relevant ratios and other second-order information. The output of these calculations feeds into a variety of charts that make it easy to spot trends and red flags. I should note that I do all of this before I read a 10-K or conference call transcript. The rationale is that the red flags alert me to things to watch for when digging through the company’s other filings.
After the graphs comes the documentation, which includes at least one 10-K and usually recent 8-Ks and the most recent conference call transcript. If, while checking the calculations and graphs, I notice something in the company’s history that I can’t explain, I’ll read the 10-K or other material from that period to find an explanation (e.g. a big drop-off in revenues? check for a divestiture or deconsolidation). After this, I take a stab at the valuation. Normalized historical performance is the starting point of my valuations, though I make considerable adjustments for persistent trends (which would render normalized performance inapplicable), and I use what I’ve learned from the conference calls and other material to guide expectations for the future (e.g. the ongoing impact on margins of increased competition or of commodity prices).
If my valuation looks good, I dig more deeply into other source material as necessary, such as the DEF 14A proxy statements (for information on ownership structure and executive compensation) and any SC 13Ds filed over the last two years (which show whether activists are involved and often contain good commentary from those agitating for change). Specific questions can often be answered by searching through older conference calls. It is often worth looking at competitors and their performance relative to their valuation as well (though I rarely let relative valuation guide an investment decision).
So there you have it, I’ve now provided a bit of insight into my process. What does your process look like? What tools do you find indispensable in analyzing investments?
Author Disclosure: None.