The Future Landscape of Performance Attribution

Date: September 16, 2009

It is now 20 years since the publication in the Financial Analysts Journal of “Determinants of Portfolio Performance” by Brinson, Hood and Beebower. Together with “Measuring Non-US Equity Portfolio Performance”, published in the Journal of Portfolio Management a year earlier by Brinson and Fachler, these papers, now collectively known as the “Brinson Model”, have provided the foundation for much of the development in performance attribution in subsequent years.

Recently the number of papers published seems to have increased and we have seen an apparent acceleration in the evolution of performance attribution methodologies, but to date very little to challenge the Brinson orthodoxy. Allen in 1991, and Ankrim and Hensel in 1992, introduced multi-currency attribution models, but by far the most important paper of the 1990s must be “Global Asset Management and Performance Attribution”, published by Denis Karnosky and Brian Singer in 1994 and based on the continued efforts by Brinson Partners to address practical issues in the management of global portfolios. This paper not only establishes the importance of correctly identifying interest rate differentials in multi-currency portfolios but asserts that the management of multi-currency assets must be sub-optimal if currency is not managed as an independent asset. Clearly, given Karnosky and Singer’s heritage working for Brinson Partners, their methodology is firmly embedded in the Brinson Model approach.

Whilst Karnosky and Singer is the most significant paper published in the 1990s, it does have its flaws, most obviously the reliance on continuously compounded (log) returns, which are never, in my experience, presented to clients. It could be argued that only deviations away from the benchmark should adjust for interest rate differentials. By using the equity risk premium concept, the reference benchmark in the non-cash part of the Karnosky and Singer model implicitly adjusts for any deviation away from base currency, including the strategic benchmark. This makes sense from the perspective of the original paper (which favours managing currency as a separate asset and is presented from the US domestic point of view) but not from the perspective of the portfolio managers, who should not be impacted by the strategic decision to invest in non-domestic assets – in other words, the benchmark exposure to non-domestic currency assets is achieved for free. Nevertheless, this effect is not significant and is relatively easy to resolve; Karnosky and Singer-type methodologies remain the benchmark for multi-currency attribution.

In recent years we have seen many attempts to resolve the natural problem caused by the failure of multi-period arithmetic attribution analysis to add up. GRAP (Groupe de Recherche en Attribution de Performance) in 1997, Carino in 1999, Menchero in 2000, Kirievsky & Kirievsky also in 2000, Davies and Laker in 2001 and Frongello in 2002 have all published approaches with the common theme of redistributing arithmetic residuals to allow the multi-period Brinson Model to add up; these approaches are collectively termed “smoothing algorithms”. Most smoothing algorithms have been developed by software providers keen to provide attribution analysis that meets their clients’ requirement to actually add up, and for the most part they had been in use for many years prior to being published.
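A minimal sketch of the underlying problem, using hypothetical two-period numbers (not drawn from any of the papers above): single-period arithmetic excess returns, summed across periods, do not reconcile to the compounded multi-period excess return, and it is precisely this leftover residual that smoothing algorithms redistribute.

```python
# Hypothetical two-period example of the arithmetic linking residual
# that smoothing algorithms (GRAP, Carino, Frongello, etc.) redistribute.
rp = [0.10, 0.05]  # portfolio returns per period (illustrative)
rb = [0.08, 0.02]  # benchmark returns per period (illustrative)

def compound(returns):
    """Geometrically link single-period returns into a multi-period return."""
    total = 1.0
    for r in returns:
        total *= 1.0 + r
    return total - 1.0

multi_period_excess = compound(rp) - compound(rb)          # 0.155 - 0.1016 = 0.0534
sum_of_period_excess = sum(p - b for p, b in zip(rp, rb))  # 0.02 + 0.03 = 0.05
residual = multi_period_excess - sum_of_period_excess      # 0.0034 left unexplained
```

However the 0.0034 is apportioned back to the single-period effects, the apportionment itself is arbitrary – which is the author's complaint about the whole exercise.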

My initial thought that the software providers were reluctant to publish their methodologies because they were embarrassed by their rather crude approach is perhaps unfair; there seems to have been a genuine view that they might retain some competitive advantage by keeping these methodologies proprietary. Thankfully, very few clients are prepared to accept this “black box” treatment today, and software providers are required to be extremely open with their methodologies in a wide range of areas. Collectively we will only continue to move forward at pace if firms are prepared to be open. Few methodologies are free from criticism, and I continue to encourage software providers to be open with their approach; there are plenty of other factors in which they can differentiate their service.

Very little of the work on smoothing algorithms has advanced the field of performance attribution. At best it has been a major distraction in the general evolution of methodologies, and at worst a great deal of time and effort has been spent distributing a residual that neither exists in reality nor equates to best practice. The problem is simply eliminated by using geometric excess returns.
Whilst geometric excess returns are now generally considered to be technically correct, a surprisingly large number of firms continue to calculate excess returns geometrically internally and present them to clients arithmetically. Although the conversion in perceived value from arithmetic to geometric excess returns has been glacial so far, the greater drive for accuracy, the increased awareness of the owners of capital (the increased use of performance fees for pension funds in particular is encouraging a better understanding of the more appropriate definition of excess return) and the inefficiencies of smoothing algorithms for longer-term calculations should increase the acceptance and understanding of geometric excess returns.
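A sketch of why geometric excess returns eliminate the residual, using the same style of illustrative numbers: the geometric excess return for each period, (1 + portfolio return) / (1 + benchmark return) − 1, compounds exactly to the multi-period geometric excess return, so there is nothing left over to smooth.

```python
# Geometric excess returns compound without residual (illustrative figures).
rp = [0.10, 0.05]  # portfolio returns per period
rb = [0.08, 0.02]  # benchmark returns per period

def compound(returns):
    """Geometrically link single-period returns into a multi-period return."""
    total = 1.0
    for r in returns:
        total *= 1.0 + r
    return total - 1.0

# Per-period geometric excess return: (1 + portfolio) / (1 + benchmark) - 1
per_period_geometric = [(1 + p) / (1 + b) - 1 for p, b in zip(rp, rb)]

linked_geometric = compound(per_period_geometric)
total_geometric = (1 + compound(rp)) / (1 + compound(rb)) - 1
# linked_geometric equals total_geometric to machine precision: no residual.
```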

A number of geometric attribution methodologies have been published for some time, notably Bain (1996), Burnie, Knowles & Teder (1998), and Bacon (2002), all of which had been used for many years prior to publication and, again, are for the most part based on the Brinson Model.

Although endemic, the Brinson Model does have its flaws. For straightforward two-dimensional (stock & sector) or three-dimensional (stock, sector & currency) equity analysis the simple model is adequate. For fixed income attribution the Brinson Model is insufficient – by necessity separate tools have been developed to measure the exposure to duration, the changing shapes of yield curves, credit spreads and other relevant factors. Claude Giguère’s recently published article “Thinking Through Fixed Income Attribution – Reflections from a Group of French Practitioners” is particularly instructive in highlighting these difficulties.

In addition, all attribution models based on the Brinson Model actually overestimate the positive impact of asset allocation. The calculation of asset allocation is based purely on the allocation bet (the difference between portfolio weight and benchmark weight) combined with the differential between the specific sector index return and the overall benchmark return. This calculation ignores the transaction costs associated with asset allocation decisions. Traditionally all transaction costs are implicitly included in stock selection, because the actual returns of assets in a sector are calculated net of all transaction costs and then compared to the sector index return to calculate stock selection effects. Clearly there are transaction costs associated with asset allocation decisions (which for some asset classes, such as emerging markets, can be quite significant); these costs should be borne by the asset allocator, not the stock picker.
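The point can be sketched numerically. The standard Brinson-Fachler allocation effect for a sector is simply (portfolio weight − benchmark weight) × (sector index return − total benchmark return); nothing in this expression carries the dealing costs of putting the bet on, so by construction those costs fall into stock selection. The cost-adjusted variant below is an illustrative sketch of the author's suggestion, not part of the original model; the figures are hypothetical.

```python
def allocation_effect(wp, wb, rb_sector, rb_total):
    """Brinson-Fachler allocation effect for one sector:
    (portfolio weight - benchmark weight) * (sector index return - benchmark return)."""
    return (wp - wb) * (rb_sector - rb_total)

def allocation_effect_net(wp, wb, rb_sector, rb_total, est_cost):
    """Illustrative variant: charge an estimated transaction cost of the
    allocation bet to the allocator rather than leaving it in stock selection."""
    return allocation_effect(wp, wb, rb_sector, rb_total) - est_cost

# 30% portfolio weight vs 20% benchmark weight; sector returned 12% vs 8% overall.
gross = allocation_effect(0.30, 0.20, 0.12, 0.08)            # 0.10 * 0.04 = 0.004
net = allocation_effect_net(0.30, 0.20, 0.12, 0.08, 0.0005)  # 0.0035 after 5bp of costs
```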

The Brinson Model has served asset managers reasonably well over the last twenty years, but will it meet the increasing demands of the industry in the next five, given changing investment styles, the need for absolute return analysis, coping with more complex instruments, the changing role of performance analysts and the ever increasing demands of a wider range of stakeholders in the investment decision process? What will be the future landscape of performance attribution in response to these demands?

To determine this future landscape we must first pause and establish exactly how performance attribution should be utilised by asset management firms. Although attribution analysis was first developed as an aid for portfolio managers it is equally useful for performance analysts, senior management, client relationship specialists, risk controllers, operations staff and marketing. To date asset managers have failed to recognise the wider uses of performance attribution and have not maximised their considerable investment in performance attribution systems.

Performance analysts have obviously always been key users of performance attribution. In identifying the sources of excess return it is the tool that allows them to add value and participate in the firm’s investment decision process. Performance measurement and analysis is not external to the investment process, it should be part of the decision process, quantifying the active decisions of portfolio managers, identifying structural issues, monitoring consistency and informing senior management.

The performance analyst’s role is changing; clearly they remain responsible for the integrity of the firm’s performance return calculations, for both internal and external use, and for both individual accounts and composites. However, in an outsourcing environment there is the recognition that the role of performance analysts is not one of data scrubbing and ensuring the inputs to the return calculation are accurate – that is clearly the domain of the back office or the third party administrator – but rather one of ensuring that the outputs and analysis are accurate and appropriate to the investment strategy.

Performance analysts must be skilled all-rounders: they need to understand in detail all the investment strategies of the firm, the sources of risk and return and the pricing of complex instruments, whilst being able to articulate their analysis to portfolio managers, senior management and clients. Analysts cannot wash their hands of the need for accuracy; the starting point of any analysis must be that the identified sources of return reconcile to the performance return reported internally or externally to the client. But their real added value is supporting the portfolio managers specifically, and the asset management firm generally, with informed analysis of the investment decision process.

The boundaries between risk controllers and performance analysts will become increasingly blurred; both functions belong in the middle office, and in the future environment they will need similar skills and tools. Performance analysts are in fact very effective risk controllers. Senior management and clients are most concerned that the rewards received are worth the risk, not just at total fund level but at every step of the decision process. This requires the breakdown of both return and risk in a consistent manner, suggesting close collaboration between performance and risk professionals.

Currently, unless the performance measurement function is an adjunct of the back office, the operations function will typically not have used attribution analysis to assist them in their day-to-day work. This will change: daily attribution analysis is a form of reconciliation or diagnostic tool, and if used properly it will immediately identify potential pricing errors, transaction processing errors and incorrect corporate actions. Clearly such a tool must be transaction based, work at security or instrument level and of course be available daily. Data scrubbing is clearly a back office activity.

All too often in the past the work of performance measurers has been dominated by the accuracy of the return calculation rather than the analysis of risk and return. It is clearly inefficient for one function to be responsible for the accuracy of another's data: the back office should be equipped with the appropriate tools and take responsibility for the accuracy of their own data, and performance measurers must upgrade to analysts and take responsibility for the quality of their own analysis. Clearly attribution tools which have in-built residuals, such as holdings-based or buy-and-hold methods, are not suitable as operational tools.

Although the demand for security level analysis has increased significantly over recent years, any added value or insight created is rather limited. Security level attribution information is perishable; the portfolio managers may be keen immediately after the end of the month or quarter to verify their qualitative understanding with quantitative analysis, and of course it will provide an interesting point of dialogue with clients, but in reality it is only of immediate use and the detailed security analysis of a year ago is of little value. Security analysis is best used as a diagnostic tool for the back office, a forensic tool for the middle office and an immediate communications tool for the front office. Regardless of the value, some pension funds have the expectation that asset managers have access to security level analysis.

Other potential stakeholders in the investment decision process include the dealers at one end and the research analysts at the other. The holy grail of performance analysis would be the measurement of the entire investment decision process from start to finish – unfortunately we are probably more than five years away from this consolidated target, and only a few firms are even thinking of doing it. Many firms may measure each part of the process, but bringing it all together is difficult to achieve.

Attribution analysis must respond to the wider changes in the investment management industry and meet the challenges posed by more complex instruments, absolute return strategies, private equity and hedge funds. The currently accepted definition of attribution – “Performance attribution is a technique used to quantify the excess return of a portfolio against its benchmark into the active decisions of the investment decision process” – must be adapted to allow for the analysis of absolute return strategies. Benchmarks are no longer straightforward; they are either heavily customised or legitimately don’t exist at all.

Fundamentally, attribution analysis in the future must differentiate appropriately between alpha and beta. In some senses the traditional Brinson Model does just that, with the more common use of the term alpha, incremental return above the market, equating to stock selection, and beta, or market exposure, equating to asset allocation. However, this is an oversimplification; the terms alpha and beta should be used in their more proper sense: alpha being the excess return adjusted for systematic risk, and beta being systematic risk, with high beta reflecting leveraged market exposure and low beta reflecting damped market exposure. Risk-adjusted attribution is not at all well developed, but given the need to accurately divide return between genuine risk-adjusted alpha and beta, and the need to appropriately reflect the risk exposures of complex instruments, much more development is needed in this rather complex area.
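In the more proper sense described above, alpha can be sketched as the excess return left after stripping out the beta-scaled market premium. The following is standard Jensen/CAPM arithmetic with hypothetical figures, not a methodology proposed in this article, but it illustrates why a high-beta portfolio beating its benchmark may still have negative risk-adjusted alpha.

```python
def risk_adjusted_alpha(rp, rb, rf, beta):
    """Jensen-style alpha: portfolio return minus the risk-free rate and
    the beta-scaled benchmark premium, i.e. rp - rf - beta * (rb - rf)."""
    return rp - rf - beta * (rb - rf)

# A portfolio returning 10% against an 8% benchmark looks 2% ahead arithmetically,
# but with a beta of 1.5 and a 2% risk-free rate the risk-adjusted alpha is negative:
# 0.08 - 1.5 * 0.06 = -0.01, i.e. the extra return was bought with leveraged exposure.
alpha = risk_adjusted_alpha(0.10, 0.08, 0.02, 1.5)
```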

Fixed income attribution is in reality risk-adjusted attribution; software providers and asset managers have toiled for many years to meet market requirements. The basic Brinson Model is simply not able to meet the requirements of bond managers, who need a detailed decomposition of the yield, or to accurately reflect the risk and return profiles of bonds with embedded options, mortgage-backed securities, and interest rate, inflation and volatility swaps.

The asset managers best prepared for this new environment will be those with clearly delineated front, middle and back offices: a back office or third party administrator equipped with appropriate tools, including daily, transaction-based, security level attribution, to enable them to ensure the inputs and the return calculation are correct; a middle office including performance and risk professionals equipped with appropriate tools to identify and quantify all sources of return and risk; and a front office with access to sufficient analysis to verify their understanding of performance and risk and to articulate the results of their investment process to senior management and clients, together with additional scenario analysis, forecasting and back testing tools.

In conclusion, in addition to other major challenges faced by asset managers they must look forward not to continued steady evolution of performance attribution over the next five years, but accelerated revolution responding to a redefinition of attribution, the obsolescence of the Brinson Model in the face of new investment strategies and complex instruments and the growing demands of a wider range of stakeholders in the investment management process. They must also review the best way to deliver attribution analysis in future years. Should asset managers maintain their own systems either built internally or bought off the shelf, should they turn to ASP type service providers or should they outsource the activity completely?

Managers of performance teams in particular will need to prepare for:

1) An upgrade in the role of performance measurers to performance analysts with appropriate higher level skillsets.

2) Increasing collaboration and perhaps combination of the performance measurement and risk control functions.

3) An assessment of the attribution models currently used in the firm. Will minor adjustments to the current model – for example, interest rate differentials for multi-currency, and estimated transaction costs for asset allocation – be sufficient, or will a more fundamental change of approach be required?

4) A redefinition of attribution focusing on absolute sources of return and risk.

Software providers will be required to deliver flexible tools that can meet the needs of multiple users with wide-ranging requirements, that can price and break down the risk and return of increasingly complex and esoteric instruments, and that provide alternatives to the Brinson method, which is now obsolete. In the longer term software providers will be tasked with consolidated analytical tools, extending attribution analysis to include research and dealing activities and effectively combining ex-post and ex-ante risk measurement with performance attribution.

Fasten your safety belts, the pace of change will be quicker and the bends in the road ahead will be more frequent and certainly more severe.
