Recent research by Deutsche Bank helps pension funds understand the differences between the models commonly used by asset managers and advisors to allocate their assets.
Deutsche Bank (DB) recently published a substantial piece of work analysing different approaches to portfolio construction which makes a useful contribution to the existing material available on the issue.
http://pull.dbgmresearch.com/cgibin/pull/DocPull/64FF1A/74067871/0900b8c086a8e89a.pdf
As DB rightly point out, historically there has been a focus among the institutional investor community on chasing “alpha adding” ideas such as individual trades, asset rotation strategies or identifying star asset managers.
While these can have an important influence on a pension scheme’s position, it has become increasingly apparent since 2008 that the less focused-on (and perhaps less glamorous) areas of portfolio construction and risk attribution can also exert a big influence on a scheme’s investment outcomes. For example, most investment professionals would agree that diversification between asset classes and sources of return is a good thing, but there are a number of competing ways to achieve it, and the standard theories accepted pre-crisis were challenged when a broad-based liquidity contraction caused many asset classes and “alternative” strategies to plunge in value simultaneously.
In the research DB consider a number of risk-based approaches to portfolio allocation (here is a high-level description of risk-based investing ideas), looking in detail at a few competing models and ideas and reaching some interesting and useful conclusions.
Here we summarise the main parts of the analysis and argue that there are some clear takeaways for pension funds and their approaches to asset allocation and portfolio construction.
Main takeaways
 There are a number of ways that a risk-based approach can be implemented in practice; the work suggests that an over-engineered approach can lead to exchanging model risk for estimation risk
 That is, we might come up with a better risk model (compared to a standard volatility estimate, for example) but the parameters become hard to estimate
 The work shows that allocating to assets on the basis of the inverse of their volatility improves the risk-adjusted return compared to conventional benchmarks
 However this can be materially improved by investing in assets on the basis of their overall risk contribution, taking into account their correlation with other assets (which is what most risk parity managers would do)
 More sophisticated risk measures, such as measures of tail dependence, can also be used for allocation; these are shown to lead to asset allocations similar to risk parity, but with greater noise in the allocations
 Diversification is not the same thing as the presence of many different assets
 Surprisingly, a multi-asset portfolio constructed on the basis of maximising diversification in a mathematically defined sense actually appears, optically, not to be very diverse (up to 70% in US Treasuries since 2007, with the balance in US equity and US high yield)
 This is a consequence of rising correlation among asset classes
 This should cause pension funds to rethink what they understand by diversification
 Risk parity improves Sharpe ratio in a multiasset portfolio (compared to a fixed weight equity/bond benchmark), although admittedly the multiasset results are driven by the allocations to bonds in the various portfolios (given the time period chosen corresponds to a large bond market rally)
 A better illustration of the effectiveness of risk parity is the study of different equity regions, sectors, and stocks, where the risk parity approach shows a significant Sharpe ratio benefit of around 0.2 to 0.3 across various regions, compared to the equivalent market capitalization benchmark

 Particularly interesting is the case of Japanese equity, where the benchmark gives a negative return over the period, but all the risk-based portfolio allocations show positive Sharpe ratios of 0.3 or higher
 Commodities are not studied here, but other research suggests a similar result is true for risk parity allocations within commodity sub-sectors
 This is comparable to the Sharpe-ratio pick-up associated with multi-asset-class risk parity in longer-term studies (for example this one by AQR)
 Minimum variance portfolio methods, which dominate much of the literature, can result in portfolios concentrated in allocations to a very small number of the assets in the universe (as few as two or three). While this can mathematically minimize the variance (and may appear attractive on variance-based risk-adjusted return measures), MinVar portfolios can show greater tail dependence than other portfolios
 On the other hand, if the universe of assets is arbitrarily or badly chosen, then allocating to all the assets in the universe on a risk parity basis will not necessarily lead to a good outcome – simply applying risk parity within a large and arbitrary universe doesn’t make sense
 In fact, we see that for multi-asset Risk Parity, the asset universe selection is crucial: building Risk Parity from 11 assets from different asset classes will introduce unintended overall biases in risk (often growth / equity risk will dominate) – this is supported by other work on the subject, for example this paper
 When looking at universes of a very large number of assets (say, several hundred equity stocks), risk parity begins to “look like” an equally weighted portfolio
 The takeaway here is that risk parity is most effective when employed between a relatively small number of asset classes, or regional sub-asset classes
 Risk-based allocations result in portfolios with lower tail dependence, and higher diversification characteristics, than the fixed-weight or market capitalization portfolios used as benchmarks
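As a rough illustration of the difference between naive inverse-volatility weighting and correlation-aware risk parity discussed in the takeaways above, here is a sketch in Python. The volatilities and correlations are made-up numbers (not from the DB paper), and the log-barrier formulation of risk parity is one standard numerical trick, not necessarily the method the paper uses:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical annualised vols and correlations for three asset classes
# (illustrative numbers only, not taken from the DB paper).
vols = np.array([0.15, 0.06, 0.10])            # equities, bonds, credit
corr = np.array([[ 1.0, -0.2, 0.6],
                 [-0.2,  1.0, 0.1],
                 [ 0.6,  0.1, 1.0]])
cov = np.outer(vols, vols) * corr
n = len(vols)

# Inverse-volatility weights: ignore correlations entirely.
w_iv = (1 / vols) / (1 / vols).sum()

# Risk parity (equal risk contribution): minimise a log-barrier
# objective whose optimum equalises w_i * (cov @ w)_i across assets.
def objective(w):
    return 0.5 * w @ cov @ w - np.log(w).sum() / n

res = minimize(objective, np.ones(n) / n, bounds=[(1e-6, None)] * n)
w_rp = res.x / res.x.sum()

contrib = w_rp * (cov @ w_rp)                  # per-asset risk contributions
print("inverse-vol weights:", np.round(w_iv, 3))
print("risk-parity weights:", np.round(w_rp, 3))
print("risk contributions :", np.round(contrib / contrib.sum(), 3))
```

Note how the correlation structure (equities and credit positively correlated, bonds slightly negatively correlated with equities) pushes the risk-parity weights away from the pure inverse-volatility ones, which is exactly the refinement described in the takeaways.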
Notes on methodology
The paper studies seven different approaches to asset allocation, mainly inspired by current practice and the literature on the subject. They are:
 Equal weighting
 An equal capital weight to each asset. No dependence on return or volatility assumptions
 Inverse volatility
 Allocations are made to assets on the basis of the inverse of that asset’s volatility, without reference to other assets
 Risk Parity
 Allocations are made to each asset such that each asset contributes an equal amount to the portfolio’s risk, taking into account its relationship (or correlation) with other assets
 Global Minimum variance
 All asset volatility and correlation estimates are fed into an optimizer at each point in time, and the overall minimum variance portfolio is constructed
 Efficient frontier
 Max diversification
 This portfolio is built to maximise the ratio between the “naive” weighted sum of the volatilities of the underlying assets and the actual volatility of the portfolio, thereby achieving a portfolio with the maximum volatility benefit arising from diversification
 Minimum tail dependence
 Similar to maximum diversification, but rather than volatility this uses a measure that quantifies the tail-risk dependence between the assets in the portfolio, and aims for an allocation that maximises the diversification benefit in terms of this property
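To make two of the optimised approaches above concrete, here is a minimal Python sketch of the global minimum variance and maximum diversification portfolios on a toy three-asset universe. The input numbers are invented for illustration, and the long-only, fully-invested constraints are our assumptions, not necessarily the paper's:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative (made-up) inputs: three assets' vols and correlations.
vols = np.array([0.15, 0.06, 0.10])
corr = np.array([[ 1.0, -0.2, 0.6],
                 [-0.2,  1.0, 0.1],
                 [ 0.6,  0.1, 1.0]])
cov = np.outer(vols, vols) * corr
n = len(vols)
cons = [{"type": "eq", "fun": lambda w: w.sum() - 1}]   # fully invested
bnds = [(0.0, 1.0)] * n                                  # long-only

# Global minimum variance: minimise w' cov w.
mv = minimize(lambda w: w @ cov @ w, np.ones(n) / n,
              bounds=bnds, constraints=cons)

# Maximum diversification: maximise the ratio of the weighted sum of
# asset vols to portfolio vol (so we minimise its negative).
def neg_div_ratio(w):
    return -(w @ vols) / np.sqrt(w @ cov @ w)

md = minimize(neg_div_ratio, np.ones(n) / n, bounds=bnds, constraints=cons)

print("min-variance weights :", np.round(mv.x, 3))
print("max-div weights      :", np.round(md.x, 3))
print("diversification ratio:", round(-md.fun, 3))
```

On this toy universe the minimum variance portfolio piles into the low-volatility asset, illustrating the concentration effect flagged in the takeaways, while the diversification ratio of the max-div portfolio exceeds 1 whenever correlations are below 1.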
The paper applies these seven allocation approaches to a few different problems, namely:
 Allocation between asset classes
 Allocation to equity regions within equities
 Allocation to equity sectors within US equities
 Allocation to different equity stocks within given regions (US, Europe, Japan)
Also interesting to look at (though not studied in the paper) would be allocation between commodity sub-sectors (energy, precious metals, industrial metals, agriculturals) and between individual commodity contracts within a sub-sector.
The results are presented in some quite innovative and intuitive styles, as well as the more well-known and generic ones:
 Risk-adjusted returns are analysed using the Sharpe ratio
 Risk is also considered in terms of CVaR / expected shortfall, in order to allow for tail risks
 Relationships between the different portfolio allocation approaches are characterized using a “dendrogram” or cluster analysis, which intuitively groups similar approaches into families using a quantitative system
 The stability of the underlying allocations is analysed through time
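As a small sketch of the first two measures (Sharpe ratio and historical CVaR / expected shortfall), here is a Python example using simulated monthly returns in place of real data; the 95% confidence level and the simulation parameters are arbitrary choices of ours:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated monthly excess returns for one portfolio (illustrative only).
returns = rng.normal(loc=0.005, scale=0.02, size=240)

# Annualised Sharpe ratio from monthly excess returns.
sharpe = returns.mean() / returns.std(ddof=1) * np.sqrt(12)

# Historical CVaR / expected shortfall at the 95% level: the average
# of the worst 5% of monthly returns, reported as a positive loss.
level = 0.95
cutoff = np.quantile(returns, 1 - level)        # the 95% VaR threshold
cvar = -returns[returns <= cutoff].mean()

print(f"Sharpe ratio (annualised): {sharpe:.2f}")
print(f"95% CVaR (monthly loss)  : {cvar:.2%}")
```

CVaR is always at least as large as the corresponding VaR, which is why it is preferred here as a tail-risk measure: it reflects how bad the losses beyond the threshold actually are, not just where the threshold sits.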