Tuesday, July 31, 2012

“We Will Do Whatever it Takes”

When I first heard Draghi’s quote “We Will Do Whatever it Takes,” I immediately thought that I should dig through financial history for other good quotes from similarly positioned officials.  Let’s start with Prime Minister John Major in England in 1992.

I found this clip reminding us of the 1992 British Pound devaluation.

http://news.bbc.co.uk/onthisday/hi/dates/stories/september/16/newsid_2519000/2519013.stm “The move is a dramatic U-turn in government policy, as only last week Prime Minister John Major reaffirmed the government's commitment to remaining within the mechanism.” (see his strong words below at the 2:34 mark)

Another near real-time view of Black Wednesday, September 16, 1992.

I think the last lines from the June 4, 2010 Atlantic article “Go For the Jugular” sum it up very well:

“That evening, Lamont called a press conference in the Treasury's central courtyard. At 7:30 p.m., facing a massive battery of TV cameras from all over the world, he announced Britain's exit from the exchange-rate mechanism.

The markets had won, and the government had at last recognized it.”

Application of Horizon Plots

For background, please see the prior posts Horizon Plot Already Available and Cubism Horizon Charts in R.

Good visualization simplifies, and stories are better told with effective and pretty visualizations.

Although horizon plots are not immediately intuitive, I have embraced them as an extremely effective method of analyzing more than four series.  I hope they become much more popular, so I can use them with more confidence.  If we look at a traditional cumulative growth chart of the managers dataset provided by PerformanceAnalytics, I get confused by too many lines and colors, since there are 10 different series.  While this chart works, it can be better.
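For intuition about what a horizon plot does, here is a minimal sketch of the folding logic (a hypothetical helper for illustration only; latticeExtra's horizonplot handles all of this internally):

```r
# fold_bands is a made-up illustration, not part of latticeExtra:
# a horizon plot clips a series into stacked bands of fixed height,
# with negative values mirrored up into the same vertical space
fold_bands <- function(y, horizonscale = 1, nbands = 3) {
  # band i of the positive part: values between i*h and (i+1)*h, clipped
  pos <- lapply(seq_len(nbands) - 1, function(i)
    pmin(pmax(y - i * horizonscale, 0), horizonscale))
  # negative values are reflected (-y) and banded the same way
  neg <- lapply(seq_len(nbands) - 1, function(i)
    pmin(pmax(-y - i * horizonscale, 0), horizonscale))
  list(positive = pos, negative = neg)
}

set.seed(1)
bands <- fold_bands(cumsum(rnorm(100, sd = 0.3)))
# each element of bands$positive / bands$negative is one overlaid layer;
# darker color = higher band, which is why many series fit in little space
```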

From TimelyPortfolio

We could panel the data, but I think this makes comparison even more difficult.

From TimelyPortfolio

In this case and many others, horizon plots provide what I feel is both a more attractive and a more effective visualization.  Here is an example using latticeExtra’s horizonplot function with very little adjustment.  You can detect comovement and seasonality and compare amplitudes simultaneously.

From TimelyPortfolio

With a little additional formatting, we get an ideal visualization: pretty and effective.  The ability to scale well beyond 10 series offers power that we cannot obtain with a traditional line chart.

From TimelyPortfolio

As another example, let’s look at how we can use horizon plots to monitor a moving average system similar to Mebane Faber’s timing model.  If you follow the link, you can see a decent visualization of the price and moving average.  A horizon plot can accomplish this much more efficiently.
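For reference, the timing rule itself is simple; here is a hedged sketch of my reading of it (not Faber's code), using quantmod and an example ticker:

```r
# sketch of the 10-month moving average timing rule:
# long when the monthly close is above its 10-month SMA, otherwise cash
require(quantmod)  # also loads TTR for SMA
getSymbols("VTI", from = "2005-01-01")  # VTI used purely as an example

monthly <- Cl(to.monthly(VTI))
sma10   <- SMA(monthly, n = 10)
# lag the signal so this month's position uses last month's close vs. SMA
signal  <- lag(ifelse(monthly > sma10, 1, 0))
system.returns <- ROC(monthly, type = "discrete") * signal
```

The horizon plots in this post visualize the input to that rule, the percent distance of price from its moving average, rather than the resulting equity curve.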

From TimelyPortfolio

I personally like the mirrored horizon plot even better.  Let’s incorporate that.

From TimelyPortfolio

Please help me popularize these extremely powerful charts.

R code from GIST (do raw for copy/paste):

require(lattice)
require(latticeExtra)
require(directlabels)
require(reshape2)
require(quantmod)
require(PerformanceAnalytics)
require(RColorBrewer) #needed for brewer.pal below
data(managers)
#replace NA with 0
managers[which(is.na(managers),arr.ind=TRUE)[,1],
  unique(which(is.na(managers),arr.ind=TRUE)[,2])] = 0
testprice <- cumprod(1+managers)-1
testdf <- as.data.frame(cbind(index(testprice),coredata(testprice)),stringsAsFactors=FALSE)
testmelt <- melt(testdf,id.vars=1)
colnames(testmelt) <- c("date","series","growth")
testmelt[,"date"] <- as.Date(testmelt[,"date"])
#just plain old xyplot from xts package examples
direct.label(
  xyplot(testprice,
         lwd = 2,
         screens = 1,
         col = c(brewer.pal(n=8,"Dark2")[1:6], brewer.pal(n=9,"PuBu")[5:9]),
         panel = function(x, y, ...) {
           panel.xyplot(x, y, ...)
         },
         scales = list(tck = c(1,0), y = list(draw = TRUE, relation = "same", alternating = FALSE)),
         xlab = NULL,
         main = "Performance Since 1996 or Inception"),
  list(last.bumpup, hjust = 0.75, cex = 0.8))
#get panel in row 1 and column 1; there is only one panel because screens = 1
trellis.focus("panel", 1, 1, highlight = FALSE)
panel.refline(h = pretty(coredata(testprice)), col = "gray70", lty = 3)
#does not even qualify but here as another example
xyplot(testprice,
       scales = list(tck = c(1,0), y = list(draw = TRUE, relation = "same", alternating = FALSE)),
       panel = function(x, y, ...) {
         panel.grid(col = "grey", lty = 3)
         panel.xyplot(x, y, ...)
       },
       layout = c(1, NCOL(testprice)))
xyplot(testprice,
       col = c(brewer.pal(n=8,"Dark2")[1:6], brewer.pal(n=9,"PuBu")[5:9]),
       screens = colnames(testprice),
       lwd = 3,
       strip = FALSE, strip.left = TRUE,
       scales = list(x = list(tck = c(1,0), alternating = FALSE),
                     y = list(tck = c(0,1), draw = TRUE, relation = "same", alternating = 2)),
       panel = function(x, y, ...) {
         panel.refline(h = pretty(coredata(testprice)), col = "gray70", lty = 3)
         panel.xyplot(x, y, ...)
       },
       main = "Performance Since 1996 or Inception")
#first horizonplot with little adjustment
horizonplot(testprice, horizonscale = 1,
            #turn off ticks on top and do not draw y ticks or axis
            scales = list(tck = c(1,0), y = list(draw = FALSE, relation = "same")),
            #draw strip on top
            strip = TRUE,
            #do not draw strip to left since we have strip = TRUE above
            strip.left = FALSE,
            #do standard horizon but also add horizontal white grid lines
            panel = function(x, ...) {
              panel.horizonplot(x, ...)
              #draw white horizontal grid
              #h = 3 means 3 lines, dividing each panel into fourths
              #v = 0 draws no vertical grid lines
              panel.grid(h = 3, v = 0, col = "white", lwd = 1, lty = 1)
            },
            layout = c(1, ncol(testprice)),
            main = "Performance Since 1996 or Inception")
## amended from horizonplot example given in documentation
horizonplot(testprice,
            scales = list(tck = c(1,0), y = list(draw = FALSE, relation = "same")),
            origin = 0,
            horizonscale = 1,
            colorkey = FALSE,
            panel = function(x, ...) {
              panel.horizonplot(x, ...)
              panel.grid(h = 3, v = 0, col = "white", lwd = 1, lty = 3)
            },
            ylab = list(rev(colnames(testprice)), rot = 0, cex = 0.8, pos = 3),
            xlab = NULL,
            par.settings = theEconomist.theme(box = "gray70"),
            strip.left = FALSE,
            layout = c(1, ncol(testprice)),
            main = "Performance Since 1996 or Inception")
#horizon plot version of http://www.mebanefaber.com/timing-model/
#do horizon of percent above or below 10 month or 200 day moving average
tckrs <- c("VTI","VEU","IEF","VNQ","DBC")
getSymbols(tckrs, from = "2010-12-31")
prices <- get(tckrs[1])[,4]
for (i in 2:length(tckrs)) {
  prices <- merge(prices, get(tckrs[i])[,4])
}
colnames(prices) <- tckrs
n = 200
#get percent above or below the running mean
pctdiff <- (prices / apply(prices, MARGIN = 2, FUN = runMean, n = n) - 1)[n:NROW(prices),]
horizonplot(pctdiff,
            scales = list(tck = c(1,0), y = list(draw = FALSE, relation = "same")),
            origin = 0,
            horizonscale = 0.05,
            colorkey = FALSE,
            panel = function(x, ...) {
              panel.horizonplot(x, ...)
              panel.grid(h = 3, v = 0, col = "white", lwd = 1, lty = 3)
            },
            ylab = list(rev(colnames(prices)), rot = 0, cex = 0.8, pos = 3),
            xlab = NULL,
            par.settings = theEconomist.theme(box = "gray70"),
            strip.left = FALSE,
            layout = c(1, ncol(prices)),
            main = "Percent Above or Below 200-Day Moving Average")
# for one more example, let's do a mirror horizon plot
horizonplot.offset <- function(x, horizon.type="offset", horizonscale=0.05, title=NA, alpha=0.4) {
  #get the positive and negative plots for the offset chart
  #very similar to the mirror chart above
  #except the negative values will be moved to the top of the y range
  #and inverted
  ppos <-
    xyplot(x, ylim=c(0,horizonscale), origin=0,
           par.settings=theEconomist.theme(box="transparent"),
           lattice.options=theEconomist.opts(),
           xlab=NULL, ylab=NULL,
           panel = function(x, y, ...) {
             for (i in 0:round(max(y)/horizonscale,0))
               panel.xyarea(x, y=ifelse(y>0,y,NA)-(horizonscale*i),
                            col="green", border="green", col.line="green",
                            alpha=alpha, lwd=2,
                            scales=list(y=list(draw=FALSE)), ...)
           },
           main=title)
  pneg <-
    xyplot(x, ylim=c(0,horizonscale), origin=horizonscale,
           panel = function(x, y, ...) {
             for (i in 0:round(min(y)/-horizonscale,0)) {
               panel.xyarea(x, y=horizonscale+ifelse(y<0,y,NA)+(horizonscale*i),
                            col.line="red", border="red", col="red",
                            lwd=2, alpha=alpha, ...)
             }
           })
  return(ppos + pneg)
}
horizonplot.offset(pctdiff, title = "Percent Difference from 200 Day Moving Average")

Friday, July 27, 2012

Hi R and Axys, I’m d3.js “Nice to Meet You” (On the iPhone)

I am still definitely in the proof-of-concept stage, but as I progress I get more excited about the prospects of combining d3.js with R and Axys through Bryan Lewis’ really nice R websockets package (even nicer now that he has added the daemonize function).  In this iteration, I will add a cumulative growth line chart plus some animation and transitions, and then javascript will ask R to calculate drawdowns.  Instead of R returning a chart like last time, R will send the results of the drawdown calculations as JSON through the websocket.  We will then use d3.js to draw a line chart of drawdowns.  This becomes really powerful when we consider the risk/return calculations offered by the R PerformanceAnalytics package, both those that already exist and those soon to be added thanks to Google Summer of Code.

As one last bonus, at the end of the video you will see that we can get this interactive reporting experience and websocket communication on an iPhone or iPad.

As a quick review of what is happening:

  1. Axys runs a report called perhstsp.rep.
  2. Axys calls the cdataset.xlsm testd3axys macro and sends performance information.
  3. Excel cdataset.xlsm testd3axys macro does some very basic formatting, converts the data to JSON, creates a webpage, and opens the webpage in the default browser.
  4. Browser opens the webpage and d3.js generates a bar graph of performance and then a cumulative growth line chart.
  5. Browser provides a button to open a websocket with R and send the performance information originally calculated in Axys.
  6. R receives the performance data through the websocket, calculates drawdown, and then sends the drawdown calculations as JSON back to the browser.
  7. Browser receives the drawdown calculations and d3.js plots them as a line chart.
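Step 6 depends on the shape of the JSON that R sends back.  A small sketch (toy data, not the Axys feed) of how RJSONIO serializes a data.frame versus a matrix, which is why the gist below sends a matrix:

```r
# toy illustration of the JSON shapes RJSONIO produces;
# the actual payload in the gist comes from Axys via Excel
require(RJSONIO)

df <- data.frame(date = c("01/31/2012", "02/29/2012"),
                 portfolio = c(-0.01, -0.03),
                 stringsAsFactors = FALSE)

# a data.frame serializes as an object of column arrays
cat(toJSON(df), "\n")

# converting to a matrix first serializes row-wise, the shape the
# gist's comments say d3 handles more naturally
cat(toJSON(as.matrix(df)), "\n")
```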

 

The next set of iterations will focus on cleaning up the d3.js charts and adding interactivity.

So far I have received no comments.  Please let me know what you think about this.

The list of acknowledgements is starting to get long. I really appreciate all the fine work done by Mike Bostock on d3.js https://github.com/mbostock/d3/wiki, the dedicated authors of the R package PerformanceAnalytics http://cran.r-project.org/web/packages/PerformanceAnalytics/index.html, Bryan Lewis, the author of http://illposed.net/websockets.html and the example, the author of RJSONIO http://cran.r-project.org/web/packages/RJSONIO/index.html, and Bruce McPherson at http://excelramblings.blogspot.com/ for the inspirational idea.

To work through on your own, you will need the Excel file cdataset.xlsm, the Axys report perhstsp.rep, and the R code from GIST.

#I deserve no credit for this code; most comes from the example provided at http://bigcomputing.com/PerformanceAnalytics.R
# R/websockets example
if(any(is.na(packageDescription('caTools'))))
  stop("This demo requires the caTools package.\nRun install.packages('caTools') to install it.\n\n")
if(any(is.na(packageDescription('PerformanceAnalytics'))))
  stop("This demo requires the PerformanceAnalytics package.\nRun install.packages('PerformanceAnalytics') to install it.\n\n")
library('websockets')
library('caTools')
library('quantmod')
library('PerformanceAnalytics')
require('RJSONIO')
#w = createContext(webpage=static_text_service(htmldata))
w = createContext()
# Set up an established (initialization) callback
g = function(DATA, WS, ...)
{
  websocket_write("hello", WS)
  print("established connection")
}
setCallback("established",g, w)
# Set up a receive callback
f = function(DATA, WS, ...)
{
  #get data sent from websocket
  d = tryCatch(rawToChar(DATA), error=function(e) "")
  #convert JSON message to data.frame and eventually xts
  perf <- fromJSON(d)
  perf.df <- as.data.frame(matrix(unlist(fromJSON(d)), ncol=3, byrow=TRUE), stringsAsFactors=TRUE)
  perf.df[,2:3] <- apply(perf.df[,2:3], MARGIN=2, as.numeric)/100
  #name columns
  colnames(perf.df) <- c("date","portfolio","sp500")
  #get as xts so convert dates from %m/%d/%Y to %Y-%m-%d
  perf.xts <- as.xts(perf.df[,2:NCOL(perf.df)], order.by=as.Date(perf.df[,1], format="%m/%d/%Y"))
  drawdown <- Drawdowns(perf.xts)
  drawdown.matrix <- as.matrix(cbind(index(drawdown), coredata(drawdown)))
  colnames(drawdown.matrix) <- c("date", colnames(drawdown))
  drawdown.matrix[,"date"] <- format(as.Date(drawdown.matrix[,"date"]), "%m/%d/%Y")
  #this would send the dataframe as an object of arrays
  #websocket_write(toJSON(df),WS);
  #use a matrix instead, which sends an array of rows (d3 prefers and handles better)
  websocket_write(toJSON(drawdown.matrix), WS)
  #websocket_write("got your data; thanks",WS)
}
setCallback("receive", f, w)
#daemonize(w)
cat("\nThe web service will run until <CTRL>+C is pressed.\n")
cat("Open your local web browser to http://localhost:7681\n")
while(TRUE) {
  service(w, timeout=1000L)
  #old service(w)
  #old Sys.sleep(0.05)
}
rm(w)
gc()

Wednesday, July 25, 2012

Inspirational Stack Overflow Dendrogram Applied to Currencies

When I saw the answer to this Stack Overflow question, I immediately remembered my old post Clustering with Currencies and Fidelity Funds and just had to try to apply this technique.  As I should have guessed, it worked with only a minimal amount of change.  Hoping to improve incrementally, I added a couple of slight modifications.

From TimelyPortfolio

R code from GIST (select raw to copy/paste):

require(quantmod)
require(fAssets)
#get asian currency data from the FED FRED data series
getSymbols("DEXKOUS",src="FRED") #load Korea
getSymbols("DEXMAUS",src="FRED") #load Malaysia
getSymbols("DEXSIUS",src="FRED") #load Singapore
getSymbols("DEXTAUS",src="FRED") #load Taiwan
getSymbols("DEXCHUS",src="FRED") #load China
getSymbols("DEXJPUS",src="FRED") #load Japan
getSymbols("DEXTHUS",src="FRED") #load Thailand
getSymbols("DEXBZUS",src="FRED") #load Brazil
getSymbols("DEXMXUS",src="FRED") #load Mexico
getSymbols("DEXINUS",src="FRED") #load India
getSymbols("DTWEXO",src="FRED") #load US Dollar Other Trading Partners
getSymbols("DTWEXB",src="FRED") #load US Dollar Broad
currencies<-merge(DEXKOUS,DEXMAUS,DEXSIUS,DEXTAUS,DEXCHUS,DEXJPUS,DEXTHUS,DEXBZUS,DEXMXUS,DEXINUS,DTWEXO,DTWEXB)
currencies<-na.omit(currencies)
currencies<-currencies/lag(currencies)-1
# try to do http://stackoverflow.com/questions/9747426/how-can-i-produce-plots-like-this
# Sample data
n <- NROW(currencies)
k <- NCOL(currencies)
d <- as.matrix(na.omit(currencies))
x <- apply(d+1,2,cumprod)
t <- assetsDendrogramPlot(as.timeSeries(currencies))
r <- t$hclust
# Plot
op <- par(mar=c(0,0,0,0),oma=c(0,2,0,0))
# set up plot area for the dendrogram
plot(NA,ylim=c(.5,k+.5), xlim=c(0,4),axes=FALSE)
# Dendrogram. See ?hclust for details.
xc <- yc <- rep(NA,k)
o <- 1:k
o[r$order] <- 1:k
#separate into 4 groups for color classification
groups <- cutree(r, k=4)[r$order]
# loop through each to generate the dendrogram
# go from innermost to outermost
for(i in 1:(k-1)) {
  a <- r$merge[i,1]
  x1 <- if( a<0 ) o[-a] else xc[a]
  y1 <- if( a<0 ) 0 else yc[a]
  b <- r$merge[i,2]
  x2 <- if( b<0 ) o[-b] else xc[b]
  y2 <- if( b<0 ) 0 else yc[b]
  #draw the lines for the dendrogram
  lines(
    3+c(y1,i,i,y2)/k,
    c(x1,x1,x2,x2),
    lwd=k-i,
    col=groups[colnames(d)[abs(a)]]
  )
  xc[i] <- (x1+x2)/2
  yc[i] <- i
}
# Time series
axis(2,1:k,colnames(d)[r$order],las=0, cex.axis=0.6, line=-1, lwd=0, lwd.ticks=1)
u <- par()$usr
for(i in 1:k) {
  f <- c(0,3,i-.5,i+.5)
  f <- c(
    (f[1]-u[1])/(u[2]-u[1]),
    (f[2]-u[1])/(u[2]-u[1]),
    (f[3]-u[3])/(u[4]-u[3]),
    (f[4]-u[3])/(u[4]-u[3])
  )
  par(new=TRUE, fig=f)
  plot(x[,r$order[i]], axes=FALSE, xlab="", ylab="", main="", type="l", col=groups[i], lwd=2)
  box()
}
par(op)

Tuesday, July 24, 2012

The Failure of Asset Allocation - Bonds Are An Imperfect Hedge

US investors were spoiled by US Treasuries, which acted as a near perfect hedge to stocks during the 2008-2009 crisis.  However, in a real crisis, bonds rarely offer any comfort, and asset allocation fails (see the post Death Spiral of a Country and the IMF paper Systemic Banking Crises Database: An Update; by Luc Laeven ... – IMF).  As a very timely example, we can examine Spain, which has not even reached crisis levels yet.
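Before the full RQuantLib repricing in the code below, a rough sanity check helps: a bond's price return is approximately minus its modified duration times the change in yield.  A back-of-envelope sketch (the duration figure is my assumption for illustration, not taken from the post's data):

```r
# duration approximation of a bond price return:
#   price return ~ -modified duration * change in yield
yield_t0 <- 0.055   # yesterday's 10y yield, 5.5%
yield_t1 <- 0.060   # today's 10y yield, 6.0%
duration <- 7.5     # ASSUMED modified duration for a 10y bond near these yields

approx_return <- -duration * (yield_t1 - yield_t0)
# -7.5 * 0.005 = -0.0375, so a 50bp rise in yield costs roughly 3.75% in price
```

The RQuantLib FixedRateBondPriceByYield loop below does this properly by repricing the bond at each day's yield, but the sign and rough magnitude should match this approximation.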

From TimelyPortfolio

In Spain, there is nowhere to hide, and allocation offers no comfort.

R code in Gist (click raw to copy/paste):

#analyze asset allocation experience in Spain
require(lattice)
require(latticeExtra)
require(reshape2)
require(directlabels)
require(quantmod)
require(PerformanceAnalytics)
require(RQuantLib)
data <- read.csv("spain stocks and bond from bloomberg.csv",stringsAsFactors=FALSE)
spainstock <- na.omit(as.xts(as.numeric(data[2:NROW(data),2]),order.by=as.Date(data[2:NROW(data),1],"%m/%d/%Y")))
colnames(spainstock) <- "SpainStocks.IBEX"
spainbond <- na.omit(as.xts(as.numeric(data[2:NROW(data),5]),order.by=as.Date(data[2:NROW(data),4],"%m/%d/%Y")))
colnames(spainbond) <- "SpainBonds.10y"
spainbondpricereturn<-spainbond
spainbondpricereturn[1,1]<-0
colnames(spainbondpricereturn)<-"SpainBond.10y.Price"
#use quantlib to price the Spanish bonds from yields
#these are 10 year bonds so will advance date by 10 years
#we can just use US/GovtBond calendar
for (i in 1:(NROW(spainbond)-1)) {
  spainbondpricereturn[i+1,1] <- FixedRateBondPriceByYield(
    yield=spainbond[i+1,1]/100, issueDate=Sys.Date(),
    maturityDate=advance("UnitedStates/GovernmentBond", Sys.Date(), 10, 3),
    rates=spainbond[i,1]/100, period=2)[1]/100 - 1
}
#merge returns
spain.return <- na.omit(merge(spainbondpricereturn,ROC(spainstock,type="discrete",n=1)))
#get drawdowns
spain.drawdown <- Drawdowns(spain.return)
#get in melted data.frame for lattice
spain.drawdown.df <- as.data.frame(cbind(index(spain.drawdown),coredata(spain.drawdown)))
spain.drawdown.melt <- melt(spain.drawdown.df,id.vars=1)
colnames(spain.drawdown.melt) <- c("date","series","drawdown")
spain.drawdown.melt[,"date"] <- as.Date(spain.drawdown.melt[,"date"])
#plot drawdowns
direct.label(asTheEconomist(
  xyplot(drawdown~date, groups=series, data=spain.drawdown.melt,
         main="Spain - Drawdown of Stocks and Bonds (source: Bloomberg)")),
  list("last.points", hjust=1, vjust=0, cex=1.2))

Thursday, July 19, 2012

Best of Axys, R, d3.js, and HTML5

Axys, R, d3.js, and HTML5 all offer incredibly powerful tools for investment management and reporting, but they are not set up to synergistically interact to fill each other’s gaps and leverage each other’s strengths.  In my ideal scenario, Axys serves as the accounting system and performance calculator, R serves as the advanced financial/statistical engine, d3.js serves as the interactive reporting component, and HTML5 offers the user interface and ties everything together through websockets (nicely demoed here).  After working and suffering with Axys for 12 years, I am amazed that it all seems to be coming together.  I provided a bare proof of concept for Axys to d3.js in my post Axys to d3.js Error Catching and Formatting.  Now let’s extend that to R and websockets through the generously contributed R websockets package.  I have borrowed very heavily from the author's YouTube example.

In this proof of concept, Axys will calculate performance and send it to Excel through a graph macro which creates JSON and an html page (thanks Bruce http://excelramblings.blogspot.com/).  The html page contains javascript and d3.js to produce a simple bar chart.  Now we add a button to take the JSON created by Excel and embedded in our html and send it to R.  R will produce a charts.PerformanceSummary chart and send it back as a jpeg to the html page.  The html page will receive the image and replace the d3.js bar chart with the image.

Going forward I will only use R as the statistical engine and continue to rely on d3.js for the interactive reporting.  How far I go with this depends heavily on user response.  Please let me know if you would like me to continue down this path.

The list of acknowledgements is starting to get long.  I really appreciate all the fine work done by Mike Bostock on d3.js https://github.com/mbostock/d3/wiki, the dedicated authors of the R package PerformanceAnalytics http://cran.r-project.org/web/packages/PerformanceAnalytics/index.html, Bryan Lewis, the author of http://illposed.net/websockets.html and the example, the author of RJSONIO http://cran.r-project.org/web/packages/RJSONIO/index.html, and Bruce McPherson at http://excelramblings.blogspot.com/ for the inspirational idea.

To work through on your own, you will need the Excel file cdataset.xlsm, the Axys report perhstsp.rep, and the R code from GIST.

#borrowed significant and substantial portions of code from http://bigcomputing.com/PerformanceAnalytics.R
#see youtube http://www.youtube.com/embed/0iR8Fo0jwW8 for the demo
# R/websockets example
if(any(is.na(packageDescription('caTools'))))
  stop("This demo requires the caTools package.\nRun install.packages('caTools') to install it.\n\n")
if(any(is.na(packageDescription('PerformanceAnalytics'))))
  stop("This demo requires the PerformanceAnalytics package.\nRun install.packages('PerformanceAnalytics') to install it.\n\n")
library('websockets')
library('caTools')
library('quantmod')
library('PerformanceAnalytics')
require('RJSONIO')
#w = createContext(webpage=static_text_service(htmldata))
w = createContext()
# Set up an established (initialization) callback
g = function(DATA, WS, ...)
{
  websocket_write("hello", WS)
  print("established connection")
}
setCallback("established",g, w)
# Set up a receive callback
f = function(DATA, WS, ...)
{
  #get data sent from websocket
  d = tryCatch(rawToChar(DATA), error=function(e) "")
  #convert JSON message to data.frame and eventually xts
  perf <- fromJSON(d)
  perf.df <- as.data.frame(matrix(unlist(fromJSON(d)), ncol=3, byrow=TRUE), stringsAsFactors=TRUE)
  perf.df[,2:3] <- apply(perf.df[,2:3], MARGIN=2, as.numeric)/100
  #name columns
  colnames(perf.df) <- c("date","portfolio","sp500")
  #get as xts so convert dates from %m/%d/%Y to %Y-%m-%d
  perf.xts <- as.xts(perf.df[,2:NCOL(perf.df)], order.by=as.Date(perf.df[,1], format="%m/%d/%Y"))
  f = tempfile()
  jpeg(file=f, width=850, height=500, quality=100)
  devAskNewPage(ask=FALSE)
  charts.PerformanceSummary(perf.xts, colorset = rich6equal, lwd = 2, ylog = TRUE)
  dev.off()
  p <- base64encode(readBin(f, what="raw", n=1e6))
  p <- paste("data:image/jpg;base64,\n", p, sep="")
  websocket_write(paste(p), WS)
  file.remove(f)
  #this would send the dataframe as an object of arrays
  #websocket_write(toJSON(df),WS);
  #a matrix instead sends an array of rows (d3 prefers and handles better)
  #websocket_write(toJSON(as.matrix(df)),WS)
  #websocket_write("got your data; thanks",WS)
}
setCallback("receive", f, w)
cat("\nThe web service will run until <CTRL>+C is pressed.\n")
cat("Open your local web browser to http://localhost:7681\n")
while(TRUE) {
  service(w, timeout=1000L)
  #old service(w)
  #old Sys.sleep(0.05)
}
rm(w)
gc()

Wednesday, July 18, 2012

Axys to d3.js Error Catching and Formatting

I got so excited in d3.js Chart of Axys Performance that I did not test adequately.  When I added a y-axis, I realized that the axis was reversed.  All the links should now point to the corrected code, and I also added some labels.  This is a work in progress, so please stay tuned.

To try it for yourself, please download the perhstsp.rep Axys report and a stripped down version of cDataSet.xlsm (see the full version for more http://ramblings.mcpher.com/Home/excelquirks/downloadlist).  If you would just like to see the final html output, go to http://bl.ocks.org/3106445.

Thanks to Mike Bostock http://bost.ocks.org/mike/ for all his incredible work on everything, but especially d3.js.  Also, thanks again to Bruce for his stimulating work with Excel.

Friday, July 13, 2012

d3.js Chart of Axys Performance

Please see Axys to d3.js Error Catching and Formatting for a much better version.

Way back in January I made some amazing (is it sad that I consider this amazing?) discoveries as I tried to push the limits of Advent Axys reporting, and I even integrated R with Axys in R in Axys (Impossible Dream).  Now I have accomplished what I believe to be even more exciting and potentially useful by combining the nice Excel and d3.js work done at http://excelramblings.blogspot.com/ with 20-year-old Axys technology (nicely improved by http://viabinary.com/ViaBinary/Home.html).  For those of you who know reporting in Axys, I’m sure you share my excitement.  Please let me know your thoughts.

To prove that I’m not dreaming, here is the screencast.

To try it for yourself, please download the perhstsp.rep Axys report and a stripped down version of cDataSet.xlsm (see the full version for more http://ramblings.mcpher.com/Home/excelquirks/downloadlist).  If you would just like to see the final html output, go to http://bl.ocks.org/3106445.

Thanks to Mike Bostock http://bost.ocks.org/mike/ for all his incredible work on everything, but especially d3.js.  Also, thanks again to Bruce for his stimulating work with Excel.

Thursday, July 5, 2012

More Exploration of Crazy RUT

While playing with the lawstat package in R, I unintentionally started building systems (STANDARD DISCLAIMER: NOT INVESTMENT ADVICE AND WILL LOSE LOTS OF MONEY, SO PROCEED WITH CAUTION) based on the Jarque-Bera test of normality (entry in Wikipedia or the research paper http://scholar.google.com/scholar?cluster=18293285759900281575&hl=en&as_sdt=0,1).  Since I was only playing around, I sort of deliberately curve-fitted a system to work on the Russell 2000.  After a lot of experimentation, I came up with something very interesting, which led me to test multiple other indexes.  It seemed to work well only on the Russell 2000, and mainly over the last 5 years, which of course led me right back to the question posed in Crazy RUT and explored again in Crazy RUT in Academic Context Why Trend is Not Your Friend: why don’t most momentum systems work over the last decade on the Russell 2000 when they work on almost all other indexes?  Please let me know if you have a good explanation, or if you want to test for regimes.  Here is the result; I apologize that I spent little time making these pretty.
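For readers unfamiliar with the test, here is a minimal sketch of the Jarque-Bera idea (hand-rolled for illustration only; the gist below uses the robust rjb.test from lawstat instead):

```r
# toy Jarque-Bera statistic: grows with sample skewness and excess
# kurtosis, so large values (small p-values) flag non-normal returns
jb_stat <- function(x) {
  n <- length(x)
  z <- (x - mean(x)) / sd(x)
  s <- mean(z^3)   # sample skewness
  k <- mean(z^4)   # sample kurtosis (3 for a normal distribution)
  n / 6 * (s^2 + (k - 3)^2 / 4)
}

set.seed(42)
jb_stat(rnorm(500))      # small for normal draws
jb_stat(rt(500, df = 3)) # typically much larger for fat-tailed draws
```

The rolling p-value of this kind of statistic is what the system below uses as a signal: stretches where weekly returns look non-normal are treated differently from stretches where they look normal.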

From TimelyPortfolio

Even more strange, the system applied to Kenneth French’s small classification shows different results.

From TimelyPortfolio

Here is how the system looks on the S&P 500; again, certainly nothing special.

From TimelyPortfolio

R code from GIST (do raw for copy/paste):

require(lawstat)
require(quantmod)
require(PerformanceAnalytics)
#set up function to obtain the rolling p-value from the Jarque-Bera (robust) normality test
rjb.p <- function(x) {
  rjb.test(x, option="RJB")$p.value
  # rjb.test(x, option="RJB")$statistic
}
#set up function to do same routine on Russell 2000 and French small
#to test for robustness of result
testmultiple <- function(x, data.source="") {
  if (data.source=="") data.source = colnames(x)[1]
  x.roc <- ROC(to.weekly(x)[,4], type="discrete", n=1)
  x.roc[1,] <- 0
  signal.rjb <- apply.rolling(x.roc, FUN=rjb.p, width=25)
  roc.long <- ROC(to.weekly(x)[,4], type="discrete", n=8)
  #plot.zoo(signal.rjb*roc.long)#,ylim=c(0,20))
  perf <- merge(lag(ifelse((signal.rjb>0.05 & roc.long > 0.015) | (signal.rjb < 0.75 & signal.rjb > 0.4), 1, 0)) * x.roc, x.roc)
  colnames(perf) <- c("system","buyhold")
  charts.PerformanceSummary(perf, ylog=TRUE, main=paste("Performance of Systems", data.source, sep=" "))
  return("done")
}
#do this to get from Yahoo! Finance but history only goes back to 1987
#getSymbols("^RUT",from="1900-01-01")
#I have local file from Bloomberg that goes back to 1980
RUT <- read.csv("rut.csv",stringsAsFactors=FALSE)
RUT <- as.xts(RUT[,2],order.by=as.Date(RUT[,1]))
testmultiple(RUT,data.source="Russell 2000 Index")
#do again but this time using French data
my.url="http://mba.tuck.dartmouth.edu/pages/faculty/ken.french/ftp/6_Portfolios_ME_Prior_12_2_Daily.zip"
my.tempfile<-paste(tempdir(),"\\frenchmomentum.zip",sep="")
my.usefile<-paste(tempdir(),"\\6_Portfolios_ME_Prior_12_2_Daily.txt",sep="")
download.file(my.url, my.tempfile, method="auto",
              quiet = FALSE, mode = "wb", cacheOK = TRUE)
unzip(my.tempfile, exdir=tempdir(), junkpath=TRUE)
#read space delimited text file extracted from zip
french_momentum <- read.table(file=my.usefile,
                              header = TRUE, sep = "",
                              as.is = TRUE,
                              skip = 12, nrows = 12316)
colnames(french_momentum) <- c(paste("Small",
                                     colnames(french_momentum)[1:3], sep="."),
                               paste("Large", colnames(french_momentum)[1:3], sep="."))
#get dates ready for xts index
datestoformat <- rownames(french_momentum)
datestoformat <- paste(substr(datestoformat,1,4),
                       substr(datestoformat,5,6), substr(datestoformat,7,8), sep="-")
#get xts for analysis
french_momentum_xts <- as.xts(french_momentum[,1:6],
                              order.by=as.Date(datestoformat))
french_momentum_xts <- french_momentum_xts/100
#get average of small to test all
french_momentum_small <- as.xts(apply(french_momentum_xts[,1:3], MARGIN=1, FUN=mean),
                                order.by=index(french_momentum_xts))
french_momentum_price <- as.xts(apply(french_momentum_small+1, MARGIN=2, FUN=cumprod),
                                order.by=index(french_momentum_xts))
#to get cumulative return(price) for each by size x momentum
#french_momentum_price <- as.xts(apply(french_momentum_xts+1,MARGIN=2,FUN=cumprod),
#                                order.by=index(french_momentum_xts))
#to test each by momentum
#apply(french_momentum_price[,1:3],MARGIN=2,FUN=testmultiple)
testmultiple(french_momentum_price,data.source="French Small")
#now let's do on the S&P 500
getSymbols("^GSPC",from="1900-01-01")
testmultiple(GSPC,data.source="S&P 500")

Monday, July 2, 2012

Graphics Artifacts from Quarterly Commentary

For my Q2 2012 commentary, I tried multiple graphs to illustrate the disconnect of the US stock markets from the rest of the world.  I think I finally settled on this simple Excel bar graph populated by Bloomberg data, but I thought some might like to see some of the R graphical artifacts as I explored how best to illustrate my point.

From TimelyPortfolio

Although I settled on a 1-year performance chart, I really wanted to show more history, but without all the noise of a daily, weekly, or even monthly chart.  I tried a chart connecting the 200-day max/min points, but it still seemed noisy and difficult to follow.  One big letdown was that Yahoo! Finance no longer provides DJUBS, E1DOW, or P1DOW.

From TimelyPortfolio
From TimelyPortfolio

I then thought connecting the end-of-year points might offer a nice simplification, and I probably liked this best of the R charts.  It shows enough of the path to see the universal move up to 2010, and then the disconnect.
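The gist below covers the max/min and begin/middle/end versions; for the end-of-year version, a sketch along these lines with xts endpoints should work (SPY is just an example ticker, not the chart's full index list):

```r
# sketch: keep only year-end closes and rebase to the first point,
# so the line connects end-of-year values instead of daily noise
require(quantmod)
getSymbols("SPY", from = "2008-12-01")

# endpoints() returns the row numbers of the last observation per year
yearends <- SPY[endpoints(SPY, on = "years"), 4]
growth   <- coredata(yearends) / as.numeric(yearends[1])

plot(index(yearends), growth, type = "b",
     xlab = "", ylab = "growth of 1",
     main = "Year-End Points Only")
```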

From TimelyPortfolio
From TimelyPortfolio

Here is one R base graphics chart as well.  Thanks again, Josh, for the fork.

From TimelyPortfolio

As one other option, I thought I would connect just the beginning, middle, and end points since December 2008.  This certainly shows the huge disconnect between the U.S. and the rest of the world, but I found it difficult to describe the methodology within the chart.

From TimelyPortfolio
From TimelyPortfolio

I hope this helps someone somewhere.  As always, I very much enjoy comments and opinions.

R code in GIST (do raw for copy/paste):

require(lattice)
require(latticeExtra)
require(directlabels)
require(ggplot2)
require(reshape2)
require(quantmod)
require(PerformanceAnalytics)
tckrs <- c("SPY","IWM","EWP","EFA","CRB") #will have to get CRB from Systematic Investor
descr <- c("SP500","Russell2000","Spain","EAFE","CRB")
#get equity indexes
getSymbols(tckrs[1:4],from="2008-12-01")
#and since Yahoo no longer provides ^DJUBS historical prices
#use Systematic Investor
con = gzcon(url('http://www.systematicportfolio.com/sit.gz', 'rb'))
source(con)
close(con)
CRB <- get.CRB()["2008-12-01::",]
getmaxmin <- function(prices, n=100) {
  dc <- DonchianChannel(prices[,4], n)
  maxmin <- rbind(prices[1,4],
                  prices[which(prices[,4]==dc[,1]),4],
                  prices[which(prices[,4]==dc[,3]),4],
                  prices[NROW(prices),4]) / as.numeric(prices[1,4])
  maxmin <- as.data.frame(cbind(as.Date(index(maxmin)), coredata(maxmin)), stringsAsFactors=FALSE)
  colnames(maxmin) <- c("date","price")
  return(maxmin)
}
n = 200
df <- as.data.frame(cbind(descr[1], getmaxmin(get(tckrs[1]), n)), stringsAsFactors = FALSE)
colnames(df)[1] <- "index"
for (i in 2:length(tckrs)) {
  temp <- as.data.frame(cbind(descr[i], getmaxmin(get(tckrs[i]), n)), stringsAsFactors = FALSE)
  colnames(temp)[1] <- "index"
  df <- rbind(df, temp)
}
direct.label(
xyplot(price~as.Date(date),groups=index,data=df,lwd=3,type="l",main="Path of World Markets Since Dec 2008"),
list("last.qp",hjust=0.35,vjust=-0.25,cex=0.75))
direct.label(
ggplot(aes(y=price,x=as.Date(date)),data=df) + geom_line(aes(colour=index)) + theme_bw() + opts(legend.position = "none") +
#ggplot(aes(y=price,x=as.Date(date)),data=indexes.melt) + geom_smooth(aes(colour=indexes)) + theme_bw() + opts(legend.position = "none") +
opts(panel.grid.minor = theme_blank()) +
opts(axis.line = theme_segment()) +
opts(panel.border = theme_blank()) +
opts(title="Path of World Indexes Since 2008")
,list("last.qp",hjust=0.75,vjust=-0.25,cex=0.75))
#do beginning, middle, and end
getbeginend <- function(prices, middle = TRUE) {
  # connect the first, middle, and last points, rebased to 1 at the start;
  # drop the middle point if middle = FALSE
  beginend <- rbind(prices[1, 4],
                    prices[NROW(prices) %/% 2, 4],
                    prices[NROW(prices), 4]) / as.numeric(prices[1, 4])
  beginend <- as.data.frame(cbind(as.Date(index(beginend)), coredata(beginend)),
                            stringsAsFactors = FALSE)
  if (middle == FALSE)
    beginend <- beginend[c(1, 3), ]
  colnames(beginend) <- c("date", "price")
  return(beginend)
}
df <- as.data.frame(cbind(descr[1], getbeginend(get(tckrs[1]))), stringsAsFactors = FALSE)
colnames(df)[1] <- "index"
for (i in 2:length(tckrs)) {
  temp <- as.data.frame(cbind(descr[i], getbeginend(get(tckrs[i]))), stringsAsFactors = FALSE)
  colnames(temp)[1] <- "index"
  df <- rbind(df, temp)
}
asTheEconomist(direct.label(xyplot(price~as.Date(date),groups=index,data=df,lwd=3,type="l",main="Change Since Dec 2008"),
"last.qp"))
direct.label(
ggplot(aes(y=price,x=as.Date(date)),data=df) + geom_line(aes(colour=index)) + theme_bw() + opts(legend.position = "none") +
#ggplot(aes(y=price,x=as.Date(date)),data=indexes.melt) + geom_smooth(aes(colour=indexes)) + theme_bw() + opts(legend.position = "none") +
opts(panel.grid.minor = theme_blank()) +
opts(axis.line = theme_segment()) +
opts(panel.border = theme_blank()) +
opts(title="Change of Indexes Since 2008")
,list("last.qp",hjust=0.75,vjust=-0.25,cex=0.75))
#do yearly
indexes <- as.data.frame(to.yearly(get(tckrs[1]))[, 4])
for (i in 2:length(tckrs)) {
  indexes <- cbind(indexes, as.data.frame(to.yearly(get(tckrs[i]))[, 4]))
}
colnames(indexes) <- descr
indexes.roc <- ROC(indexes,type="discrete",n=1)
indexes.roc[1,] <- 0
labs <- Return.cumulative(indexes.roc)
#thanks for the fork http://blog.fosstrading.com/
par(mar=c(4,4,4,5))
chart.CumReturns(indexes.roc[,order(labs)],ylab=NA,xlab=NA,colorset=1:5,main="Path of World Indexes Since Dec 2008")
axis(side=4,at=labs,labels=FALSE,las=1,cex.axis=0.75,lwd=0,lwd.ticks=0.5,col.ticks="black",line=-0.25)
mtext(colnames(labs)[order(labs)],cex=0.75, 4, at=labs[order(labs)],las=1,col=1:5,line=1)
indexes.cumul <- apply(indexes.roc+1,MARGIN=2,cumprod)
indexes.melt <- melt(as.data.frame(cbind(as.Date(rownames(indexes.cumul)),indexes.cumul),stringsAsFactors=FALSE),
id.vars=1)
colnames(indexes.melt) <- c("date","indexes","price")
direct.label(asTheEconomist(
xyplot(price~as.Date(date),groups=indexes,data=indexes.melt,type="l",main="Path of World Indexes Since Dec 2008"))
,list("last.qp",hjust=0.35,vjust=0.1,cex=1))
direct.label(
ggplot(aes(y=price,x=as.Date(date)),data=indexes.melt) + geom_line(aes(colour=indexes)) + theme_bw() + opts(legend.position = "none") +
#to smooth use the next line instead
#ggplot(aes(y=price,x=as.Date(date)),data=indexes.melt) + geom_smooth(aes(colour=indexes)) + theme_bw() + opts(legend.position = "none") +
opts(panel.grid.minor = theme_blank()) +
opts(axis.line = theme_segment()) +
opts(panel.border = theme_blank()) +
opts(title="Path of World Indexes Since Dec 2008")
,list("last.qp",hjust=0.75,vjust=-0.25,cex=0.75))