Balancing the competing demands of research and client relationships often comes at the expense of growing assets. In addition, establishing a disciplined, repeatable investment research process that outperforms its benchmarks is not an easy undertaking; it requires time that many investment professionals simply don’t have. It’s especially difficult for small and mid-sized firms competing with large established firms that have huge research budgets and a platoon of analysts. Investment managers have communicated these issues to us numerous times over the years, so we have designed solutions built around those needs.
To position yourself for growth and to understand some of the portfolio management trade-offs, it helps to have an overview of the building blocks that make up our database. That overview can inform the process you establish for managing your equity portfolios efficiently.
In building a systematic/quantitative database, the first requirement is data: fundamentals, earnings estimates, and daily pricing. Once we receive a data feed, we run an extensive quality control process that flags any anomalies for an analyst to investigate and, if necessary, correct. Data without intellectual capital behind it to extract insights is meaningless, so once the data has been cleansed, we convert it into intelligence. This production process runs weekly to incorporate updated company fundamentals and any earnings estimate revisions. Company pricing data is updated daily, which affects deviations from intrinsic value and the relative rankings of factors.
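To make the quality-control step concrete, here is a minimal sketch of the kind of automated check described above: compare this week's values against last week's and flag any field whose change exceeds a threshold for analyst review. The field names and the 50% threshold are illustrative assumptions, not AFG's actual rules.

```python
# Flag any metric whose week-over-week change exceeds a threshold so an
# analyst can review it before the data enters production.

def flag_anomalies(prev, curr, threshold=0.5):
    """Return field names whose relative change exceeds `threshold`."""
    flags = []
    for field, new_value in curr.items():
        old_value = prev.get(field)
        if not old_value:              # new or zero-valued field: skip ratio test
            continue
        change = abs(new_value - old_value) / abs(old_value)
        if change > threshold:
            flags.append(field)
    return flags

prev_week = {"revenue": 1000.0, "eps_estimate": 2.10}
this_week = {"revenue": 1020.0, "eps_estimate": 4.90}
print(flag_anomalies(prev_week, this_week))  # → ['eps_estimate']
```

A real pipeline would run checks like this across every company and field in the feed, queuing the flagged items for human investigation rather than correcting them automatically.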
Our quantitative equity research engine systematically applies bottom-up fundamental analysis to over 20,000 companies worldwide to build proprietary factors. The ability to quickly and efficiently assimilate a broad range of information, and to build complete discounted cash flow models on a vast number of companies, gives our clients huge advantages in building equity strategies.
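For readers less familiar with discounted cash flow valuation, a toy version of the mechanics looks like the following: discount a stream of forecast free cash flows, then add a terminal value. The cash flows, discount rate, and growth rate here are made-up inputs for illustration, not a representation of AFG's proprietary model.

```python
# Toy DCF: intrinsic value = present value of forecast free cash flows
# plus the present value of a terminal value (Gordon growth formula).

def dcf_value(cash_flows, discount_rate, terminal_growth):
    # Discount each forecast-year cash flow back to today.
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    # Terminal value: final-year cash flow grown once, capitalized.
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv += terminal / (1 + discount_rate) ** len(cash_flows)
    return pv

value = dcf_value([100, 110, 120], discount_rate=0.08, terminal_growth=0.02)
```

Doing this well at scale is the hard part: each of the 20,000+ companies needs defensible forecasts and assumptions, which is exactly where the systematic engine earns its keep.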
Any time we discuss the efficacy of our quantitative models and back-test performance, we use real point-in-time data that we have been maintaining for 20 years. This live history provides a major advantage over building a research process today and back-testing factors retroactively: it avoids survivorship bias and uses as-reported data. This is why our database and factors are highly respected by institutional money managers and used as inputs to their own processes.
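The point-in-time idea can be sketched in a few lines: for any backtest date, only use the value that had actually been published by that date, even if the figure was later restated. The record layout below is a simplified assumption for illustration.

```python
# Point-in-time lookup: avoid look-ahead bias by returning only the data
# that was known as of the backtest date. Records are (publish_date,
# value) pairs; ISO date strings compare correctly as text.

def as_of(history, backtest_date):
    """Latest value whose publish date is <= backtest_date, else None."""
    known = [(d, v) for d, v in history if d <= backtest_date]
    return max(known)[1] if known else None

eps_history = [("2015-02-01", 1.80),   # originally reported
               ("2015-08-15", 1.65)]   # later restated
print(as_of(eps_history, "2015-06-30"))  # → 1.8  (the as-reported figure)
```

A backtest that instead used the restated 1.65 for mid-2015 would be trading on information no investor had at the time, which is exactly the bias a point-in-time database prevents.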
Needless to say, having a team of quants, database engineers, analysts, and programmers that can process and interpret vast arrays of data helps in developing and maintaining the research database.
Leverageable Equity Research Platform
We designed our research platform, www.Equityinsights.com, for clients to harness our intellectual capital and overlay their own process right on top of ours. Clients can leverage all of our analytics to screen, build their own DCF models, analyze and monitor portfolios, and build custom quant strategies with our factors.
The flowchart below is a visual representation of sample factors clients screen on to create an approved list.
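In code, a factor screen of this kind reduces to filtering the universe against a set of hurdles. The factor names and thresholds below are hypothetical stand-ins for the proprietary grades in the flowchart.

```python
# Illustrative screen: keep only stocks that clear every factor hurdle,
# producing the firm's approved list.

def approved_list(universe, min_value_score=70, min_quality_score=60):
    return [s["ticker"] for s in universe
            if s["value_score"] >= min_value_score
            and s["quality_score"] >= min_quality_score]

universe = [
    {"ticker": "AAA", "value_score": 85, "quality_score": 75},
    {"ticker": "BBB", "value_score": 90, "quality_score": 40},
    {"ticker": "CCC", "value_score": 65, "quality_score": 80},
]
print(approved_list(universe))  # → ['AAA']
```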
Approved lists are extremely helpful for larger organizations to ensure nobody goes rogue. We have also found that bank compliance departments find our grading system very handy during routine audits to explain their process.
Once an approved list is created, trade-off decisions on the actual managing of the portfolio need to be decided. For example, are there Portfolio Managers and Analysts that can review each stock, or will it be primarily a quantitative model? How many stocks should there be in the portfolio? Is it a sector neutral strategy? How often should the portfolio be rebalanced and how will that impact turnover and performance? These decisions are not trivial, and in order to make the best decisions you not only need empirical evidence in order to have confidence in your process, but you also must have the personnel resources to efficiently implement them.
In the matrix table below we can view the performance of a quantitative strategy using AFG’s multi-factor Investment Grade methodology and the effect of different rebalancing strategies on performance and turnover. Implementing the proper rebalancing strategy for a portfolio’s desired results is an integral part of maintaining a disciplined process.
To the far left we have various benchmarks such as the Russell 1000, followed by the “Hold” criteria. To the right of the “Hold” criteria we can see the returns of the universe versus holding only “A” grade companies while they remain “A” grade. This group of stocks returned 14.2% with monthly rebalancing, which is great performance; however, turnover ends up extremely high (199%) under this monthly rebalance strategy. That level of turnover might be suitable for our hedge fund clients, but not for those managing money for high-net-worth individuals. The second “Hold” row represents buying “A” grade companies and holding them even if they deteriorate to a “B” (selling when they turn “C”); there, the trade-off for lowering turnover by 76% is shaving a few bps off the outperformance. Looking a little farther to the right, we have more trade-off options, such as rebalancing quarterly instead of monthly, which lowers turnover further. (AFG Global Research Database: 9/30/1998 to 12/31/15)
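The "buy A, hold through B, sell at C" rule can be sketched as a simple rebalance step that also measures one-sided turnover (sells as a fraction of holdings). The tickers and grades are illustrative, and this is a schematic of the rule described in the table, not AFG's implementation.

```python
# One rebalance period under a grade-based hold rule: keep anything
# still graded above the sell threshold, buy everything at the buy
# grade, and report turnover as sells / prior holdings.

def rebalance(holdings, grades, buy_grade="A", sell_grade="C"):
    """Return (new_holdings, turnover) for one rebalance period."""
    # "A" < "C" and "B" < "C" lexicographically, so A and B are kept;
    # a ticker with no grade defaults to the sell grade and is sold.
    kept = {t for t in holdings if grades.get(t, sell_grade) < sell_grade}
    buys = {t for t, g in grades.items() if g == buy_grade}
    sells = len(holdings - kept)
    turnover = sells / len(holdings) if holdings else 0.0
    return kept | buys, turnover

holdings = {"X", "Y", "Z"}
grades = {"X": "A", "Y": "B", "Z": "C", "W": "A"}
new_holdings, turnover = rebalance(holdings, grades)
print(sorted(new_holdings), round(turnover, 2))  # → ['W', 'X', 'Y'] 0.33
```

Running this step monthly versus quarterly, or tightening the sell grade from "C" to "B", is exactly the kind of knob the matrix table measures: each choice trades turnover against outperformance.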
These are a few examples of the quantitative strategies our clients can leverage within our platform and the trade-off options money managers face. We also provide support and consulting to clients to ensure the implementation of best practices.
Fundamental analysis of an individual company provides intimate knowledge of the inner workings and qualitative aspects of a firm that a quant strategy can’t provide. Take our analysis of MasterCard, for example: researching companies at this level of detail is extremely time-consuming and labor-intensive, even more so across a broad universe of stocks, unless you have a team of analysts at your disposal. If you are a manager with limited resources, having a meaningful and effective way to filter a list of stocks down to only those that warrant further due diligence is vital. The chart below illustrates some general trade-offs based on management style.
Selecting an appropriate quantitative/due-diligence mix is important in determining where your time will be spent. For example, investment managers told us it was difficult to balance their client relationships while also staying on top of the day-to-day noise associated with the stocks in their portfolios. These were high-net-worth managers who had to account for client tax liabilities, yield, and other client circumstances, which made it challenging to wear so many hats. In most cases, adding alpha to a high-net-worth individual’s portfolio was secondary to first doing the portfolio no harm: preserve the wealth.
We consulted with many of our bank/trust clients and smaller RIAs about their needs, and we found they were seeking a strategy that could outpace the S&P 500, was sector neutral, and had low turnover. The goal was essentially a strategy that acted as an extension of their own research department, fully supported with reports and custom models. So in June of 2006 we launched the AFG 50 with these mandates, as described on our equity research platform:
“The AFG 50 is a focused group of 50 stocks designed to consistently outperform the S&P 500 index. The AFG 50 is long-only, with targeted annual turnover below 40%. It is sector-neutral relative to the S&P 500, and is rebalanced every quarter.
The stocks selected for the AFG 50 must meet a set of decision rules developed by The Applied Finance Group related to each company’s intrinsic value and market price. More importantly, each company meeting the first hurdle is then reviewed by AFG’s analyst team to address the qualitative issues that the default model may not identify.
Each company in the AFG 50 has a custom model updated every quarter by AFG analysts, who also provide timely comments on the latest company developments. AFG analysts also strive to achieve the desired level of diversification across the 50 names, to avoid any unintended exposure to a given economic factor.”
Twelve years later, the AFG 50 has delivered about 175 bps of alpha per year on average, with less than 20% annual turnover. It has underperformed in three of those 12 years. Statistically, our factors tend to outperform 60% to 70% of the time, so we expect periods of underperformance. This is why a very disciplined process is imperative: it helps avoid the biases that come with the temptation to chase returns.
The process for the AFG 50 has not changed, but the implementation has evolved. For example, it’s common for a trust department to bring in a new account with several legacy positions, so we have built tools to help transition those portfolios to the AFG 50. Clients also requested a backup list identifying likely candidates to replace a stock in the AFG 50, ensuring they never liquidate a holding that could end up back in the portfolio. Recently, GICS added REITs as a sector, so naturally the AFG 50 did as well.
So when evaluating how to continue growing AUM while maintaining solid performance in your equity portfolios, it’s imperative to weigh all of your options and trade-offs in managing client portfolios. If you are considering hiring additional personnel, I’ve found it’s best to do so once a clear, well-defined process is in place. Over the last 18 years of working at AFG, I have met with thousands of investment management firms. Their dynamics are vastly diverse, but the most successful portfolio managers and analysts have one thing in common: a leverageable and scalable process that lets them focus on what they do best.