Tuesday, March 29, 2011

The Leverage Space Trading Model

I finally got around to reading Ralph Vince’s latest The Leverage Space Trading Model (for a brief summary see this magazine article in Futures), and I’m happy to say that the book was very helpful in both approach and example.  I especially enjoyed the last two chapters, which tie his method to the realities of the money management business, realities that do not fit most economic models.

The book’s true value will hopefully reveal itself through my ability to incorporate the material.  As usual with R, some fine folks at http://www.fosstrading.com and http://automated-trading-system.com have already built a package and provided examples of its use.  To extend their examples, I wanted to apply this Leverage Space Model to the EDHEC hedge fund returns provided in the PerformanceAnalytics R package, and then use PerformanceAnalytics to produce some decent charts illustrating the output.

[Three charts from TimelyPortfolio, produced by the code below: a stacked bar of the optimal f allocations over time, time series of the first six leverage series, and a performance summary of the levered returns.]

Since the book uses $ amounts instead of percentages, I am not entirely sure I got this correct, but it is certainly close.  As always, please let me know if I messed up here or anywhere.
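For intuition on the dollars-versus-percentages question, Vince's well-known 2:1 coin-toss example (win 2, lose 1, even odds) can be checked in a few lines of base R. This is my own sketch of the optimal f calculation, not code from the book:

```r
# Optimal f maximizes geometric growth (TWR) per play:
# TWR(f) = prod over outcomes of (1 + f * outcome / abs(worst loss))^prob
outcomes <- c(2, -1)   # win 2 or lose 1
probs    <- c(0.5, 0.5)
twr <- function(f) prod((1 + f * outcomes / abs(min(outcomes)))^probs)
f.grid <- seq(0.01, 0.99, by = 0.01)
f.grid[which.max(sapply(f.grid, twr))]  # 0.25, the textbook answer
```

As I understand it, f here is a fraction of the largest loss rather than a dollar amount, which is also how optimalf() reports its output, so working in percentages should be consistent with the book's dollar examples.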

After your comments and my own improvements, I hope to extend this to basic systems and show how we can tie multiple managers, systems, or indices together with the Leverage Space approach.  Also, I would like to compare this with other optimization methods to see which are more robust.


R code:

#Please see the au.tra.sy blog http://www.automated-trading-system.com/
#and http://www.fosstrading.com for the original code
#I take no credit for the majority of this code
#I simply changed a couple of things to use the edhec data
#as another example, this time using xts-style returns

require(PerformanceAnalytics)

# Set Walk-Forward parameters (number of periods)
optim<-48 #4 years = 48 monthly returns
wf<-6 #walk forward 6 monthly returns
# get data series from PerformanceAnalytics edhec dataset
data(edhec)
rtn<-edhec[,1:13]*100
# Calculate number of WF cycles
numCycles = floor((nrow(rtn)-optim)/wf)
# Define JPT function
jointProbTable <- function(x, n=3, FUN=median, ...) {
  # Load LSPM
  if(!require(LSPM,quietly=TRUE)) stop(warnings())
  # Function to bin data
  # Function to bin data, replacing each value with FUN applied to its bin
  quantize <- function(x, n, FUN=median, ...) {
    if(is.character(FUN)) FUN <- get(FUN)
    bins <- cut(x, n, labels=FALSE)
    sapply(1:NROW(x), function(i) FUN(x[bins==bins[i]], ...))
  }
  # Allow for different values of 'n' for each system in 'x'
  if(NROW(n)==1) {
    n <- rep(n,NCOL(x))
  } else
  if(NROW(n)!=NCOL(x)) stop("invalid 'n'")
  # Bin data in 'x'
  qd <- sapply(1:NCOL(x), function(i) quantize(x[,i],n=n[i],FUN=FUN,...))
  # Aggregate probabilities
  probs <- rep(1/NROW(x),NROW(x))
  res <- aggregate(probs, by=lapply(1:NCOL(qd), function(i) qd[,i]), sum)
  # Clean up output, return lsp object
  colnames(res) <- colnames(x)
  res <- lsp(res[,1:NCOL(x)],res[,NCOL(res)])
  return(res)
}
for (i in 0:(numCycles-1)) {
  # Define cycle boundaries
  start <- 1+(i*wf)
  end <- optim+(i*wf)
  # Get returns for optimization cycle and create the JPT
  jpt <- jointProbTable(rtn[start:end],n=rep(10,13))
  outcomes <- jpt[[1]]
  probs <- jpt[[2]]
  port <- lsp(outcomes,probs)
  # DEoptim parameters (see ?DEoptim)
  np <- 130        # 10 * number of market systems
  imax <- 1000     # maximum number of iterations
  crossover <- 0.6 # probability of crossover
  NR <- NROW(port$f)
  DEctrl <- list(NP=np, itermax=imax, CR=crossover, trace=TRUE)
  # Optimize f
  res <- optimalf(port, control=DEctrl)
  # use upper to restrict f to a level you might feel comfortable with
  #res <- optimalf(port, control=DEctrl, lower=rep(0,13), upper=rep(0.2,13))

  # these are other possibilities, but I gave up after 24 hours
  #maxProbProfit from FOSS Trading
  #res <- maxProbProfit(port, 1e-6, 6, probDrawdown, 0.1, DD=0.2, control=DEctrl)
  #probDrawdown from FOSS Trading
  #res <- optimalf(port, probDrawdown, 0.1, DD=0.2, horizon=6, control=DEctrl)

  # Save leverage amounts as optimal f
  # Examples in Ralph Vince's Leverage Space Trading Model book are
  # all in dollar terms, which confuses me;
  # until I resolve that, I changed the lev line to show optimal f output
  lev <- res$f[1:13]
  levmat <- c(rep(1,wf)) %o% lev # so that we can multiply with the walk-forward returns
  # Get the returns for the next walk-forward period
  wfrtn <- rtn[(end+1):(end+wf)]/100
  wflevrtn <- wfrtn*levmat # apply leverage to the returns
  if (i==0) fullrtns <- wflevrtn else fullrtns <- rbind(fullrtns,wflevrtn)
  if (i==0) levered <- levmat else levered <- rbind(levered,levmat)
}

#not super familiar with xts, but this adds dates to levered from the fullrtns xts series
levered<-xts(levered,order.by=index(fullrtns) )
chart.StackedBar(levered, cex.legend=0.6)

#just the first six in the series as another example
#I had to fill the window to my screen to avoid an error from R on margins
par(mfrow=c(6,1))
for (i in 1:6) {
  chart.TimeSeries(levered[,i], xlab=NULL)
}

charts.PerformanceSummary(fullrtns, main="Performance Summary with Optimal f Applied")
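As a sanity check on the walk-forward bookkeeping, the cycle boundaries can be tabulated directly in base R (the 120-row count below is just an assumed example for illustration, not the actual length of edhec):

```r
optim <- 48  # optimization window in months
wf    <- 6   # walk-forward step in months
n     <- 120 # assumed number of monthly rows, for illustration only
numCycles <- floor((n - optim) / wf)
i <- 0:(numCycles - 1)
cycles <- data.frame(cycle       = i,
                     opt.start   = 1 + i * wf,
                     opt.end     = optim + i * wf,
                     trade.start = optim + i * wf + 1,
                     trade.end   = optim + i * wf + wf)
head(cycles, 3)
# cycle 0 optimizes on rows 1:48 and trades rows 49:54, and so on
```

Each optimization window slides forward by wf months, and the f values from each window are only ever applied to the wf months that follow it, so the performance summary above is out-of-sample throughout.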

2 comments:

  1. Nice post.

    I have a small suggestion - don't use jointProbTable(), use lsp() instead. I only did a quick look through, so please excuse any numerical errors!

    You have only 48 rows of data in your raw returns file. You are bucketing with n=3 and you have 13 systems. I have not looked, but I would wager that the joint probability table generated by jointProbTable() is almost as large, if not the same size, as your raw data table (the largest it can be is equal to your raw data table), yet you have reduced the information content by taking 48 individual returns for each system and assigning one of only 3 values to each.

    Instead, try using the raw data as it stands as your joint probability table - each row has a probability of 1/48. Simply stick your raw data table through the lsp() function to turn it into the lsp object needed by the optimalf() function.

    The computation cost will be similar but the information content will be much higher!
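    In code (untested, using the same variable names as the post and assuming the LSPM package is loaded), the change would be something like:

    ```r
    # Skip jointProbTable(): feed the raw 48 x 13 return window to lsp()
    # so each observed row keeps its full probability of 1/48
    window <- coredata(rtn[start:end])            # strip the xts index
    port   <- lsp(window, rep(1/NROW(window), NROW(window)))
    res    <- optimalf(port, control = DEctrl)
    ```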

  2. You are exactly right. I did not even think about that. Thanks so much. I have learned a lot since this initial post.

    I guess I will still run into the problem if there are no negative returns and will need to override the lowest and make it negative.

    Sounds like you are doing good things with LSPM. I would love to see them.
