Hierarchical Risk Parity: Efficient Portfolio Construction with Graph Theory


Hierarchical risk parity is an innovative approach to portfolio construction that leverages graph theory to create more efficient and diversified investment portfolios.

By applying graph theory, hierarchical risk parity can identify the most important assets and their relationships, allowing for a more nuanced understanding of risk and return.

This approach is particularly useful for large portfolios with many assets, as it can help to reduce complexity and improve performance.

Hierarchical risk parity has been shown to outperform traditional risk parity methods in various studies, with one study demonstrating a 20% improvement in portfolio efficiency.

Wrangle Data

To wrangle data for hierarchical risk parity, start by reading in the datasets and aggregating them into a dataframe called m, then drop any equities with null values.

This step is crucial because some securities behave poorly, staying flat for years or containing non-positive values, which can skew the data. Removing these bad securities helps ensure a more accurate analysis.
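A minimal pandas sketch of this wrangling step (tickers and values below are made up for illustration; cuDF exposes the same API on the GPU):

```python
import numpy as np
import pandas as pd

# Hypothetical miniature version of the aggregated price dataframe m.
dates = pd.date_range("2018-11-01", periods=6, freq="B")
m = pd.DataFrame(
    {
        "AAA": [10.0, 10.1, 10.2, 10.3, 10.4, 10.5],
        "BBB": [20.0, np.nan, 20.2, 20.1, 20.3, 20.4],  # contains a null
        "CCC": [5.0, 5.0, 5.0, 5.0, 5.0, 5.0],          # flat the whole period
    },
    index=dates,
)

# Drop equities with null values, non-positive prices, or no movement at all.
m = m.dropna(axis=1)
m = m.loc[:, (m > 0).all()]
m = m.loc[:, m.std() > 0]
```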


Next, split the dataframe m into two parts: a training and testing period. The training period spans from November 2018 to November 2020, while the testing period occurs from November 2020 to November 2021.

Creating a log returns matrix for the training data is essential, as it allows you to obtain the correlation and covariance matrices using built-in cuDF methods.
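For example, with synthetic prices (cuDF mirrors these pandas-style methods on the GPU):

```python
import numpy as np
import pandas as pd

# Synthetic training-period prices for three hypothetical tickers.
rng = np.random.default_rng(42)
prices = pd.DataFrame(
    100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.01, size=(252, 3)), axis=0)),
    columns=["IYF", "IYK", "XLU"],
)

# Log returns matrix: r_t = ln(P_t / P_{t-1})
log_returns = np.log(prices / prices.shift(1)).dropna()

# Built-in methods give the correlation and covariance matrices.
corr = log_returns.corr()
cov = log_returns.cov()
```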

Risk Parity Strategies

Hierarchical risk parity (HRP) is a risk parity strategy that uses machine learning techniques to optimize portfolio construction. It was introduced by Marcos López de Prado in 2016.

The HRP algorithm consists of three key stages: clustering, quasi-diagonalization, and recursive bisection. These stages work together to group assets based on their correlations and assign weights to each cluster.

Assets are grouped based on their correlations using hierarchical clustering methods like single, average, or complete linkage. This clustering structure is then used to reorder the covariance matrix, concentrating covariances around the diagonal.


Weights are assigned recursively across clusters in a top-down manner. This approach allows equities to compete only with similar equities for spots in the portfolio.

HRP can be used to construct the final asset allocation by converting continuous weights into discrete share quantities. This is done using the pypfopt.discrete_allocation.DiscreteAllocation class, which treats the conversion as an integer programming problem.

The goal is to minimize the remaining unallocated cash while also keeping each actual dollar allocation close to its target. Concretely, the optimizer minimizes the sum of the absolute differences between the target and actual dollar allocations.
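A simplified greedy sketch of this conversion (pypfopt's DiscreteAllocation solves it more carefully as an integer program; the tickers, weights, and prices below are hypothetical):

```python
def discrete_allocation(weights, prices, total_value):
    """Greedy sketch: convert continuous weights into whole-share counts.
    (pypfopt's DiscreteAllocation treats this as an integer program.)"""
    shares = {}
    leftover = total_value
    # Buy floor(target dollars / price) shares of each asset, largest weight first.
    for ticker in sorted(weights, key=weights.get, reverse=True):
        target = weights[ticker] * total_value
        n = int(target // prices[ticker])
        if n > 0:
            shares[ticker] = n
            leftover -= n * prices[ticker]
    return shares, leftover

weights = {"IYF": 0.5, "IYK": 0.3, "XLU": 0.2}
prices = {"IYF": 80.0, "IYK": 170.0, "XLU": 65.0}
alloc, cash = discrete_allocation(weights, prices, 10_000)
```

The leftover cash is what the integer program tries to drive toward zero while staying near the target weights.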

The HRP algorithm can be used to build diversified portfolios that perform well out-of-sample. By grouping assets based on their correlations, HRP can help investors reduce risk and increase returns.

Here's a brief overview of the HRP algorithm:

  • Clustering: group assets by the distance between their pairwise correlations
  • Quasi-diagonalization: reorder the covariance matrix so similar assets sit next to each other, concentrating covariances around the diagonal
  • Recursive bisection: split the reordered matrix top-down, assigning weights in inverse proportion to cluster variance

By using HRP, investors can create more efficient and diversified portfolios that are better equipped to handle market volatility.

Clustering and Weight Computation


Clustering and weight computation is a crucial step in the Hierarchical Risk Parity algorithm. The first step, hierarchical clustering of the assets, groups similar assets together based on their pairwise correlations.

Four hierarchical clustering algorithms are supported in Portfolio Optimizer: single linkage, complete linkage, average linkage, and Ward's linkage. Single linkage is the default algorithm used in the original paper.

The algorithm then recursively bisects the reordered assets correlation matrix and computes asset weights in inverse proportion to both their own variance and their cluster's variance. This step is also where constraints on asset weights can be incorporated.

The linkage function from scipy is used to perform the clustering, and the "appropriate" linkage method depends on the data characteristics. The clustering process can be visualized using a dendrogram, which shows the hierarchical structure of the clusters.
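A small sketch of this clustering step with SciPy, using López de Prado's correlation distance on a toy four-asset correlation matrix (assets 0/1 and 2/3 form highly correlated pairs):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list
from scipy.spatial.distance import squareform

# Toy correlation matrix: {0, 1} and {2, 3} are highly correlated pairs.
corr = np.array([
    [1.0, 0.8, 0.1, 0.2],
    [0.8, 1.0, 0.2, 0.1],
    [0.1, 0.2, 1.0, 0.7],
    [0.2, 0.1, 0.7, 1.0],
])

# López de Prado's correlation distance: d_ij = sqrt((1 - rho_ij) / 2)
dist = np.sqrt((1.0 - corr) / 2.0)

# Single linkage is the default used in the original HRP paper.
link = linkage(squareform(dist, checks=False), method="single")

# The dendrogram's leaf order is what quasi-diagonalizes the covariance matrix.
order = leaves_list(link)
```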

Here are the supported leaf ordering methods in Portfolio Optimizer:

The weights computed in this step are inversely proportional to the amount of risk or variance of the stock, so a high-risk stock has low representation while a low-risk stock has high representation.
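For example, inverse-variance weights for three hypothetical assets:

```python
import numpy as np

# Inverse-variance weights: w_i proportional to 1 / sigma_i^2, so the
# riskiest asset gets the smallest allocation.
variances = np.array([0.04, 0.01, 0.02])  # hypothetical asset variances
ivp = (1.0 / variances) / (1.0 / variances).sum()
```

Here the lowest-variance asset (0.01) receives the largest weight, 4/7 of the portfolio.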

Clustering


Clustering is a crucial step in the Hierarchical Risk Parity algorithm, and it's all about grouping similar assets together based on their pairwise correlations. The algorithm uses a hierarchical clustering algorithm to achieve this, and there are several methods to choose from, including single linkage, complete linkage, average linkage, and Ward's linkage.

In Portfolio Optimizer, four hierarchical clustering algorithms are supported, with single linkage being the default. The performance of the Hierarchical Risk Parity algorithm is reported to deteriorate when single linkage is not used; even so, Ward's linkage is used in the computation of the institutional FIVE Robust Multi-Asset Index.

A dendrogram is a great way to visualize the clustering process, and it's a tree-like diagram that shows the hierarchical structure of the clusters. The dendrogram is formed by recursively merging the two closest clusters together until there is only one cluster left.

Here are the four hierarchical clustering algorithms supported in Portfolio Optimizer:

  • Single linkage
  • Complete linkage
  • Average linkage
  • Ward's linkage

It's worth noting that the choice of linkage method can have a significant impact on the performance of the algorithm, and evaluation mechanisms can help determine the best choice for a given dataset.

Daily Returns


Daily returns are a crucial aspect of analyzing financial data. They measure the change in asset price from one day to the next.

To calculate daily returns, we use the formula: Daily Return = (Current Price - Previous Price) / Previous Price. This formula is essential for understanding how assets are performing over time.
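A quick illustration of this formula with pandas (prices are made up):

```python
import pandas as pd

# Prices on three consecutive trading days (hypothetical).
prices = pd.Series([100.0, 102.0, 99.96])

# Daily Return = (Current Price - Previous Price) / Previous Price
daily_returns = prices.pct_change()
```

The first entry is NaN because there is no previous price; the second is +0.02 (a 2% gain) and the third is -0.02 (a 2% loss).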

The article section provides a table with daily returns for various assets, including IYF, IYK, IYR, IYW, VGT, and XLU. These returns are calculated using the formula mentioned earlier.

For example, on 2019-10-28, the daily return for IYF was 0.0019819, meaning the price of IYF rose by roughly 0.2% from the previous day.

Here's a snapshot of the daily returns for each asset on the specified dates:

By analyzing these daily returns, we can gain insights into the performance of each asset over time.

Matrix Operations

Matrix operations play a crucial role in hierarchical risk parity.

Hierarchical risk parity involves the use of matrix operations to transform and combine risk exposures.


The concept of matrix operations is rooted in linear algebra, where matrices are used to represent relationships between variables.

In the context of hierarchical risk parity, matrices are used to represent the covariance between different assets.

The covariance matrix is a square matrix that represents the variance and covariance between different assets.

A covariance matrix can be calculated using the formula Σ = (1/n) XᵀX, where X is the matrix of mean-centered asset returns and n is the number of observations.
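A sketch verifying this formula against NumPy's built-in covariance (random returns for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 4))  # n observations x k assets of returns

# Center each asset's returns, then Sigma = (1/n) * Xc^T @ Xc
Xc = X - X.mean(axis=0)
Sigma = Xc.T @ Xc / X.shape[0]

# Agrees with NumPy's population covariance (bias=True divides by n).
reference = np.cov(X, rowvar=False, bias=True)
```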

The resulting covariance matrix is used as input for the hierarchical risk parity algorithm.

The algorithm uses matrix operations to transform the covariance matrix into a risk parity portfolio.

Matrix operations such as transpose, inverse, and multiplication are used to manipulate the covariance matrix.

These operations are used to calculate the risk parity weights for each asset in the portfolio.

The risk parity weights are then used to construct the final risk parity portfolio.

Evaluation and Comparison


The HRP portfolio allocation strategy can be evaluated using backtesting, a technique that assesses an investment strategy's performance using historical data. Backtesting allows for an assessment of how reliably the strategy could be replicated in real-world scenarios.

Three primary backtesting methods are outlined by Joubert et al., 2024: Walk-Forward Backtest, Resampling, and Monte Carlo Simulation. The Walk-Forward Backtest is easy to implement but risks overfitting, while Resampling provides more robust analysis by generating multiple scenarios.

The HRP strategy can be compared against other portfolios, such as Modern Portfolio Theory (MPT). MPT aims to maximize the Sharpe ratio, a metric that tracks the performance of a portfolio. The Sharpe ratio is calculated as the difference between the portfolio's return and the risk-free rate, divided by the portfolio's risk.
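A minimal annualized Sharpe ratio helper, assuming 252 trading days per year (the helper name and defaults are illustrative, not from the original):

```python
import numpy as np

def sharpe_ratio(daily_returns, risk_free_rate=0.0, periods=252):
    """Annualized Sharpe ratio: (portfolio return - risk-free rate) / risk."""
    excess = np.asarray(daily_returns) - risk_free_rate / periods
    return np.sqrt(periods) * excess.mean() / excess.std(ddof=1)
```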

HRP outperforms MPT in the testing period, with a Sharpe ratio of 1.51 compared to MPT's -0.18. This demonstrates the effectiveness of the HRP strategy in real-world scenarios.

Evaluation Strategy


To evaluate the performance of a portfolio allocation strategy, backtesting is a crucial technique that uses historical data to assess how reliably the strategy could be replicated in real-world scenarios. This involves splitting historical data into multiple segments and training the strategy on one segment while testing it on the next.

The Walk-Forward Backtest is a simple approach that splits historical data into segments, trains the strategy on one segment, and tests it on the next. However, this method only tests a single path, risking overfitting, and past performance may not predict future results.

Resampling is a more robust approach that creates multiple samples from historical data to assess performance. This can be done using Bootstrap, which draws samples with replacement from historical data, or Cross-Validation, which splits the data into multiple training and testing sets.

Here are the three primary backtesting methods outlined by Joubert et al., 2024:

  • Walk-Forward Backtest: train on one historical segment and test on the next; easy to implement but tests only a single path, risking overfitting
  • Resampling: create multiple samples from historical data, e.g. via bootstrap or cross-validation, for a more robust assessment
  • Monte Carlo Simulation: evaluate the strategy on many synthetic return paths generated from a model of the data

For our evaluation, we will be using the time-series bootstrap tools from the arch package to optimize the hierarchical clustering linkage method.
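To illustrate the resampling idea, here is a bare-bones circular block bootstrap sketch (the arch package provides production-grade versions such as CircularBlockBootstrap and StationaryBootstrap; this toy function is not its API):

```python
import numpy as np

def block_bootstrap(returns, block_size, n_samples, seed=0):
    """Toy circular block bootstrap: resample contiguous blocks, wrapping
    around the end of the series to preserve short-range time dependence."""
    rng = np.random.default_rng(seed)
    r = np.asarray(returns)
    T = len(r)
    n_blocks = -(-T // block_size)  # ceil division
    samples = []
    for _ in range(n_samples):
        starts = rng.integers(0, T, size=n_blocks)
        idx = (starts[:, None] + np.arange(block_size)) % T  # wrap around
        samples.append(r[idx.ravel()][:T])
    return np.stack(samples)
```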

Compare to Other Portfolios


Comparing HRP to other portfolios is a great way to understand its strengths and weaknesses. HRP was compared to Modern Portfolio Theory (MPT), which is a well-established method for building diversified portfolios.

MPT was evaluated using the Sharpe ratio, a metric that measures a portfolio's performance by considering both risk and return. Here, the risk-free rate of return served as the benchmark, and risk was modeled with the covariance matrix. Maximizing the Sharpe ratio under this model becomes a demanding numerical optimization problem, especially with over 4,000 assets.

The Sharpe ratio for MPT was maximized during the training period, but it turned negative during the testing period, implying that the portfolio underperformed the risk-free rate. This is a common problem with MPT, known as Markowitz's curse.

Here's a comparison of the Sharpe ratios for different portfolios during the training and testing periods:

As you can see, HRP performed well during both the training and testing periods, with a Sharpe ratio of 1.51. In contrast, MPT underperformed during the testing period, with a negative Sharpe ratio. IVP also performed well during the testing period, but it had a higher risk than HRP.

Portfolio Analysis


Hierarchical risk parity (HRP) is a portfolio optimization method that can result in less risk compared to inverse variance and a higher Sharpe ratio than modern portfolio theory.

The bootstrapped performance metrics for HRP are stored in long format, with the linkage_method column indicating the hierarchical clustering method used. Each bootstrapped sample contains the annualized expected return, volatility, and Sharpe ratio, computed over a resampled set of all T observations from the original time series.

Here are some key statistics from the bootstrapped performance metrics:

HRP can be a viable portfolio optimization tool for a comparatively low computation cost, thanks to the GPU speedup provided by RAPIDS.

Verify Results with PyPortfolioOpt

You can use the PyPortfolioOpt package to verify the results obtained from a manual implementation of the Hierarchical Risk Parity (HRP) algorithm. The package ships with its own HRP implementation, so comparing its output weights against those of the manual implementation is a quick way to confirm the approach is correct.

Here's a comparison of the results obtained from the manual implementation and the PyportfolioOpt package's implementation:

Analyze Speed

Analyzing speed is crucial in portfolio analysis, and it's impressive to see a 66x speedup when running on a GPU over a CPU for the maximum number of securities.

This significant speedup comes from re-creating an algorithm originally written with CPU libraries such as SciPy, pandas, and NumPy using GPU-accelerated equivalents that exploit the parallelization capabilities of the GPU.

As the number of securities increases, so does the computing power required, but this also means you can harness the power of a GPU for faster results.

Even in the worst-case scenario, you still get a 4x speedup by running on a GPU, which is no small feat.

This highlights the importance of choosing the right tools and hardware for your portfolio analysis tasks to get the most out of your computing resources.

Frequently Asked Questions

What is the HRP model in finance?

The HRP model is a portfolio diversification technique that organizes assets into a hierarchical tree structure, allowing for flexible weight assignments. This structure enables investors to create a diversified portfolio with tailored risk management and return expectations.

Doyle Macejkovic-Becker

Copy Editor
