
Proxy / Cross Hedging

October 2, 2011

The root challenge of two current equity risk and alpha projects boils down to hedging using non-underlying instruments, known as proxy hedging or cross hedging. This technique is useful for equity shaping trades, and is an underlying principle of both long/short and statistical arbitrage:

  • Price exposure: neutralize price exposure for stock or basket, possibly leaving behind useful residual (such as dividends or rights)
  • Market exposure: neutralize market exposure for a stock or basket, leaving just idiosyncratic exposure

Trading these hedges in practice is more difficult than standard texts (e.g. Hull) suggest, as the real world rarely satisfies theory: market incompleteness (challenging risk-neutral \mathbb{Q} models), stochastic covariation (challenging static models), and non-linear impulse response (challenging linear models).

This challenge is explored here via a multi-part series (see Empirical Quantiles and Proxy Selection, Empirical Copulas and Hedge Basis Risk, Lag Dynamics with Autocopulas, Exploratory Hedge Analysis, Proxy Hedging and Dependence, and Proxy Conditional Model Selection), including R code and real data. This first post presents the basic model. The second post will apply this model to a few well-known equities. Readers are encouraged to comment on improvements or alternative techniques on all posts, as this problem is real and remains an open research topic.

Proxy Hedge Model

Begin with the textbook mathematical model. This proxy hedge model is composed of two instruments u and h, long underlying and short hedge respectively, whose weighted difference generates a residual \epsilon :

   u - \beta h = \epsilon

where the hedge ratio \beta determines the proportional amount of hedge h held per unit of underlying u. Although not modeled explicitly, the joint relationship between u and h, in both sign and magnitude, is important; it is represented as \rho (correlation, or more generally a copula). Both \epsilon and \rho are time series whose values are stochastic with unknown distributions.
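As a concrete illustration of the model, the sketch below computes the residual \epsilon = u - \beta h on synthetic return data, using the textbook OLS hedge ratio as a baseline (Python with numpy; the data and the true \beta of 0.6 are invented for illustration, not from the post):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 250
h = rng.standard_normal(n)                  # hedge instrument returns
u = 0.6 * h + 0.2 * rng.standard_normal(n)  # underlying; true beta is 0.6

beta = np.dot(h, u) / np.dot(h, h)          # textbook OLS hedge ratio
eps = u - beta * h                          # residual after hedging

# by construction the residual is (sample-)orthogonal to the hedge,
# and its variance is smaller than the unhedged variance
```

Note the hedge removes only the linear price exposure to h; everything left in eps is the basis the trade is exposed to.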

The purpose of this model is to shape the residual \epsilon to generate P&L on the trade; the variability of \epsilon is also known as basis risk. When \epsilon {\rightarrow} 0, price exposure to u is neutralized. When h is a market index, a non-zero \epsilon corresponds to neutralizing market exposure while retaining idiosyncratic exposure specific to u (an exotic beta, of sorts). In both cases, the behavior of \epsilon is asymptotic under the \mathbb{P} measure. Traditionally, futures were the preferred instrument for h; more recently, ETFs have become increasingly interesting given their diversity (including levered variants).

Although the math is pedestrian, it belies two serious research challenges:

  • Instrument selection: choice of instrument for h
  • Ratio calculation: choice of algorithm to calculate \beta

Both are explored below.

Instrument Selection

Deciding the universe of h instruments to explore is more art than science, given hedges are assumed, by definition, not to be derivatives on u. Hence the standard \mathbb{Q} machinery is unavailable, and significant exploratory analysis on the marginal and joint behavior of u and h is warranted.

One traditional answer is to seek instruments which maximize Pearson correlation \rho. This turns out to be fairly naïve: although \beta expresses a linear relationship, that does not imply measurement of the u and h relationship should be constrained to linearity (Pearson captures only linear dependence). A better answer is to follow economic intuition and identify instruments with strong fundamental reasons to covary with u.
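The limitation of Pearson correlation can be made concrete with a toy example (Python with numpy; the cubic dependence is invented for illustration): for a monotone but non-linear relationship, rank (Spearman) correlation registers the full dependence while Pearson understates it.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(2000)
y = x ** 3                      # monotone but non-linear dependence on x

def pearson(a, b):
    a, b = a - a.mean(), b - b.mean()
    return np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b))

def spearman(a, b):             # Pearson correlation applied to ranks
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    return pearson(ra, rb)

rho_p = pearson(x, y)           # materially below 1
rho_s = spearman(x, y)          # essentially 1: ranks are identical
```

A proxy screened on Pearson \rho alone would look weaker than it is here, even though h = u^3 tracks u perfectly in direction.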

Ratio Calculation

Techniques for calculating \beta have been subject to intense research over recent decades. Two recent summary articles are de Prado and Leinweber (2011) and Alexander and Barbosa (2007).

For analysis here, the Box-Tiao Canonical Decomposition (BTCD) method from de Prado (2011) is selected. In short, this approach maximizes a measure of predictability generalized from Box and Tiao (1977) on the combined portfolio of u and h:

   \lambda_\Omega = \frac{\Omega'\beta'\Gamma\beta\Omega}{\Omega'\Gamma\Omega}

where \beta is the forecast (VAR) coefficient matrix and \Gamma the covariance matrix of prices. Its solution, a generalized Rayleigh quotient in \Omega, follows from the Cholesky decomposition \Gamma = C'C :

   \hat{\Omega} = C^{-1}z

where z is the extremal eigenvector of C'^{-1}(\beta'\Gamma\beta)C^{-1}.

This method is particularly useful as it permits use of an arbitrary forecasting model, instead of limiting explanatory variables to forecast variables lagged by a single period (as is common with standard techniques).
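The Cholesky-and-eigenvector recipe can be verified numerically. The sketch below (Python with numpy; random positive-definite matrices stand in for \beta'\Gamma\beta and \Gamma, since no market data is assumed) confirms that \Omega = C^{-1}z, with \Gamma = C'C, attains the corresponding eigenvalue of the quotient:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)      # stand-in for beta' Gamma beta (numerator)
M2 = rng.standard_normal((n, n))
G = M2 @ M2.T + n * np.eye(n)    # stand-in for Gamma (denominator)

C = np.linalg.cholesky(G).T      # upper triangular, so G = C'C
Cinv = np.linalg.inv(C)
D = Cinv.T @ A @ Cinv            # whitened Rayleigh quotient matrix
vals, vecs = np.linalg.eigh(D)   # eigenvalues in ascending order
z = vecs[:, 0]                   # eigenvector of the smallest eigenvalue
omega = Cinv @ z                 # candidate extremizer of the quotient

quot = (omega @ A @ omega) / (omega @ G @ omega)
# quot equals vals[0], and no other portfolio attains a smaller quotient
```

Picking the eigenvector at the other end of the spectrum extremizes the quotient in the opposite direction; the R code below selects the last eigenvector returned by R's `eigen` (which sorts in decreasing order).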

The following is R code for BTCD using a standard vector autoregression (VAR) forecasting model (with Ledoit and Wolf (2004) covariance shrinkage), with minor simplification from the derivation provided in de Prado (p. 8):


library(vars)     # VAR()
library(corpcor)  # cov.shrink()

btcdHedge <- function(p, start=1, interval=4) {
  # Generate hedge using BTCD method, as defined by de Prado [2011].
  # Args:
  #   p: matrix of instrument price data
  #   start: index into p, at which to begin generating the hedge
  #   interval: number of periods used to calibrate the hedge
  # Returns: BTCD hedge ratio vector
  end <- start + interval
  pvar <- VAR(p[start:end,], p=1, type="none")  # one-lag VAR forecast model
  varfit <- fitted(pvar)
  B <- cov.shrink(p[start:end,])  # shrunk covariance over the same window
  A <- t(varfit) %*% varfit       # gram matrix of fitted (forecast) values
  C <- chol(B)                    # B = C'C, with C upper triangular
  CInv <- solve(C)
  D <- t(CInv) %*% A %*% CInv     # whitened Rayleigh quotient matrix
  eigens <- eigen(D)              # eigenvalues in decreasing order
  z <- eigens$vectors[,length(eigens$values)]   # smallest eigenvalue
  x <- CInv %*% z
  hedge <- x / x[1]               # normalize to one unit of underlying
  # sanity check: quotient at x should equal the selected eigenvalue
  check <- (t(x) %*% A %*% x) / (t(x) %*% B %*% x)
  if (abs(check - eigens$values[length(eigens$values)]) > 0.0001)
    message("failed sanity check")
  return(hedge)
}

This algorithm requires specifying the duration of the preceding contiguous time window over which the hedge is calculated, via the interval argument. As with any window-based algorithm, the interval represents a tradeoff: shorter intervals are more responsive, but noisier; longer intervals are smoother, but less responsive. Thus, selection of interval length is more art than science.
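The responsiveness/noise tradeoff shows up even with a plain rolling OLS hedge ratio, which is easier to simulate than BTCD. In the sketch below (Python with numpy; synthetic data with a true \beta of 0.8, invented for illustration), the 10-period window tracks faster but its estimates fluctuate far more than the 120-period window's:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
h = rng.standard_normal(n)
u = 0.8 * h + 0.3 * rng.standard_normal(n)  # true hedge ratio is 0.8

def rolling_beta(u, h, window):
    # OLS-through-origin hedge ratio over each trailing window
    betas = []
    for t in range(window, len(u)):
        hw, uw = h[t - window:t], u[t - window:t]
        betas.append(np.dot(hw, uw) / np.dot(hw, hw))
    return np.array(betas)

short = rolling_beta(u, h, 10)    # responsive but noisy
long_ = rolling_beta(u, h, 120)   # smooth but slow to adapt
# short.std() exceeds long_.std(); both center near the true 0.8
```

With a time-varying true \beta the ranking of the two windows' hedging error can flip, which is precisely why the choice is art rather than science.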

21 Comments
  1. Anonu permalink
    October 3, 2011 6:10 am

    how about leveraging mean-variance optimization to come up with an optimal hedge? The variables that you can manipulate are the risk factor/covariance model and the specific form of the utility function (where you can incorporate impact costs, etc… ) ? You can provide the optimizer a large universe of stocks and let it do the number crunching given your level of risk aversion and impact you are willing to take on…

    • quantivity permalink*
      October 3, 2011 11:25 pm

      @Anonu: yes, interesting potential approach; to clarify, how do you propose defining the objective function?

      While not having conducted this analysis, my a priori intuition is that covariance instability will make the hedge ratio jump around erratically—similar to what occurs with Markowitz optimization, likely amplified by the constraint of only two instruments.

  2. January 18, 2012 10:11 pm

    Forget an optimal hedge, how about trying to estimate a certain % of the tail risk and hedge only that.

    • January 18, 2012 10:21 pm

      Riffing here, you could try to predict % cointegration, risk of divergence, and risk of loss in what you’re holding — a combination of those. E.g. are they cointegrated when you could lose money.

      • quantivity permalink*
        January 18, 2012 10:31 pm

        @Lao Tzu: uncertain whether cointegration is sufficient, as unit root implies linearly and temporally stable beta. Many proxy hedging problems do not possess these two properties.

      • January 19, 2012 3:50 pm

        maybe use a more “robust” type measure, all I’m saying is you only care if the hedge works at certain times so define a loss function that covers only those times

    • quantivity permalink*
      January 18, 2012 10:24 pm

@Lao Tzu: while that conceptually makes sense, the practical problem is bleed during the 99% of non-tail days: either bleed from long index puts or suffer upside hedge drag due to asymmetry of upside and downside betas. Neither is appealing.


  4. commodity trade statistics permalink
    May 19, 2013 1:50 am

Excellent post; however, I was wondering if you could write a little more on this subject? I'd be very grateful if you could elaborate a little bit more. Cheers!

    • quantivity permalink*
      May 19, 2013 8:15 am

Please clarify which aspect you are interested to see elaborated, as this is a large topic.


Trackbacks

  1. Empirical Quantiles and Proxy Selection « Quantivity
  2. Empirical Copulas and Hedge Basis Risk « Quantivity
  3. Finanzas 101: Proxy Hedging « Quantitative Finance Club
  4. Lag Dynamics with Autocopulas « Quantivity
  5. Exploratory Hedge Analysis « Quantivity
  6. Proxy Hedging and Dependence « Quantivity
  7. Proxy Conditional Model Selection « Quantivity
  8. Index Return Decomposition « Quantivity
  9. Return Decomposition via Mixing « Quantivity
