Monday, October 22, 2012

Resurrect Posts on Japan and the Yen

As Japan and the Yen continue to get more interesting in my mind, I wanted to resurrect some of the posts I have done on them, sorted with my favorites first.

Japan Trade by Geographic Region
Japanese Trade and the Yen
Japan Intentional or Accidental Pursuit of Deflation
Japan Trade More Specifically with Korea

Just to add a chart, here is one using data from the Federal Reserve Bank of St. Louis (FRED).  While the extreme correlation between the Yen and the S&P 500 has limited the opportunity in the Yen, that correlation has recently weakened as Japanese deficits have worsened and the Yen has stopped strengthening.

[chart from TimelyPortfolio: S&P 500, rolling 4 year correlation, and USD/Yen since 1970]

R code:

require(latticeExtra)
require(quantmod)
require(RColorBrewer) #for brewer.pal below

getSymbols("DEXJPUS",src="FRED")
getSymbols("SP500", src="FRED")

asTheEconomist(xyplot(DEXJPUS,main="Japanese Yen per US Dollar Since 1970\nSource: Federal Reserve Bank of St. Louis"))

#merge the weekly returns of Yen and SP500
ret <- na.omit(merge(weeklyReturn(DEXJPUS),weeklyReturn(SP500)))
#use the rolling correlation method from PerformanceAnalytics chart.RollingCorrelation
rollcor <- as.xts(rollapply(ret, width = 208, FUN = function(x) cor(x[,
                     1, drop = FALSE], x[, 2, drop = FALSE]), by = 1,
                     by.column = FALSE, na.pad = FALSE, align = "right"))
xyplot(na.omit(merge(SP500,rollcor,DEXJPUS)),col=brewer.pal("RdBu",n=9)[c(9,2,8)],
              lattice.options=theEconomist.opts(),
              par.settings=theEconomist.theme(box="transparent"),
              scales=list(y=list(rot=0)),
              xlab=NULL,
              strip=strip.custom(factor.levels=c("S&P 500","Correlation (Rolling 4 Year) S&P 500 and USD/Yen","USD/Japanese Yen")),
              main = "S&P 500 and USD/Yen Since 1970\nSource: Federal Reserve Bank of St. Louis")

Tuesday, October 16, 2012

Japanese Government Bond (JGB) Data Since 1974

The Ministry of Finance Japan very generously provides data on JGBs back to 1974.  Here is a quick example of how to pull it into R and graph it.

[chart from TimelyPortfolio: Japanese Government Bonds since 1974]

R code in GIST (do raw for copy/paste):

#get Japan yield data from the Ministry of Finance Japan
#data goes back to 1974
require(latticeExtra)
require(xtsExtra) #for plot.xts with screens
require(RColorBrewer) #for brewer.pal below
url <- "http://www.mof.go.jp/english/jgbs/reference/interest_rate/"
filenames <- paste("jgbcme",c("","_2010","_2000-2009","_1990-1999","_1980-1989","_1974-1979"),".csv",sep="")
#load all data and combine into one jgb data.frame
jgb <- read.csv(paste(url,filenames[1],sep=""),stringsAsFactors=FALSE)
for (i in 2:length(filenames)) {
  jgb <- rbind(jgb,read.csv(paste(url,"/historical/",filenames[i],sep=""),stringsAsFactors=FALSE))
}
#now clean up the jgb data.frame to make a jgb xts
jgb.xts <- as.xts(data.matrix(jgb[,2:NCOL(jgb)]),order.by=as.Date(jgb[,1]))
plot.xts(jgb.xts,ylim=c(0,12),screens=1,las=1)
plot.xts(jgb.xts,ylim=c(0,12),screens=c(rep(1,5),rep(2,5),rep(3,5)),las=1)
#use lattice to do the same thing
#for the sake of time will do final formatting here
xyplot(jgb.xts,col=brewer.pal("Blues",n=9)[5:9],
       ylim=c(0,12),
       screens=c(rep(1,5),rep(2,5),rep(3,5)),
       lattice.options=theEconomist.opts(),
       par.settings=theEconomist.theme(box="transparent"),
       scales=list(y=list(rot=0)),
       strip=strip.custom(factor.levels=c("1-5 Year","5-10 Year","10-40 Year"),style=5),
       main="Japanese Government Bonds Since 1974")

Life on the Big International Frontier

Although I have used the Kenneth French data library extensively in various posts, I have not yet used the international data sets that accompany the wonderful paper.

Eugene F. Fama and Kenneth R. French (2012) "Size, Value, and Momentum in International Stock Returns", Critical Finance Review

To rectify this home bias, let’s generate some efficient frontiers for the biggest cap stocks by geographic region to see how the frontiers have evolved over the last 20 years.

[chart from TimelyPortfolio: Global Biggest Cap Efficient Frontier, 1990-1999, 2000-2012, and 1990-2012]

Eventually, I would like to think through some other methods of comparing risk, return, and weights across multiple frontiers.

R code from GIST (do raw for copy/paste):

loadfrench <- function(zipfile, txtfile, skip, nrows) {
  require(xts)
  #my.url will be the location of the zip file with the data
  my.url <- paste("http://mba.tuck.dartmouth.edu/pages/faculty/ken.french/ftp/",zipfile,".zip",sep="")
  #this will be the temp file set up for the zip file
  my.tempfile <- file.path(tempdir(),"frenchzip.zip")
  #my.usefile is the name of the txt file with the data
  my.usefile <- file.path(tempdir(),paste(txtfile,".txt",sep=""))
  download.file(my.url, my.tempfile, method="auto",
                quiet = FALSE, mode = "wb", cacheOK = TRUE)
  unzip(my.tempfile,exdir=tempdir(),junkpath=TRUE)
  #read space delimited text file extracted from zip
  french <- read.table(file=my.usefile,
                       header = TRUE, sep = "",
                       as.is = TRUE,
                       skip = skip, nrows = nrows)
  #get dates ready for xts index
  datestoformat <- rownames(french)
  datestoformat <- paste(substr(datestoformat,1,4),
                         substr(datestoformat,5,6),"01",sep="-")
  #get xts for analysis
  french_xts <- as.xts(french[,1:NCOL(french)],
                       order.by=as.Date(datestoformat))
  #divide by 100 to get percent
  french_xts <- french_xts/100
  #zero out missing data, denoted by -99.99 (-0.9999 after dividing by 100)
  french_xts[french_xts < -0.99] <- 0
  return(french_xts)
}
filenames <- c("Global_25_Portfolios_ME_BE-ME","Europe_25_Portfolios_ME_BE-ME","Japan_25_Portfolios_ME_BE-ME","Asia_Pacific_ex_Japan_25_Portfolios_ME_BE-ME","North_America_25_Portfolios_ME_BE-ME")
#loop through the filenames to load the file for each region
for (i in 1:length(filenames)) {
  assign(substr(filenames[i],1,4), loadfrench(zipfile=filenames[i],txtfile=filenames[i],skip=21,nrows=266))
}
#merge the data into one xts object for ease of reference and use
big <- get(substr(filenames[1],1,4))[,21:25]
colnames(big) <- paste(substr(filenames[1],1,4),".",c("expensive",2:4,"cheap"),sep="")
#also set up equal weight to just explore the regions bigcap without valuation
big.ew <- as.xts(apply(big,MARGIN=1,FUN=mean),order.by=index(big))
colnames(big.ew) <- substr(filenames[1],1,4)
for (i in 2:length(filenames)) {
  temp <- get(substr(filenames[i],1,4))[,21:25]
  colnames(temp) <- paste(substr(filenames[i],1,4),".",c("expensive",2:4,"cheap"),sep="")
  big <- merge(big,temp)
  temp.ew <- as.xts(apply(temp,MARGIN=1,FUN=mean),order.by=index(temp))
  colnames(temp.ew) <- substr(filenames[i],1,4)
  big.ew <- merge(big.ew,temp.ew)
}
#use the equal weighted big cap
portfolio <- big.ew #change to big if you want to see the full 5x5
require(fPortfolio)
#do frontier plots for 1990-1999, 2000-current, and then the full series
#sloppy but it will work
frontier <- list(portfolioFrontier(as.timeSeries(portfolio["::1999",])),
                 portfolioFrontier(as.timeSeries(portfolio["2000::",])),
                 portfolioFrontier(as.timeSeries(portfolio)))
datelabels <- c("1990-1999","2000-2012","1990-2012")
#get colors with topo.colors for the three frontiers
#we will use the first 3 of the 4 supplied
colors <- topo.colors(4)[3:1]
for(i in 1:3) {
  frontierPlot(frontier[[i]], pch=19, xlim=c(0,0.10), ylim=c(0,0.015), title=FALSE, col=c(colors[i],colors[i]), add=as.logical(i-1))
  minvariancePoints(frontier[[i]],pch=19,col="red")
  #tangencyPoints(frontier[[i]],pch=19,col="blue")
  #tangencyLines(frontier[[i]],pch=19,col="blue")
  #equalWeightsPoints(frontier[[i]],pch=15,col="grey")
  singleAssetPoints(frontier[[i]],pch=19,cex=1,col=colors[i])
  #twoAssetsLines(frontier[[i]],lty=3,col="grey")
  #sharpeRatioLines(frontier[[i]],col="orange",lwd=2)
  #legend("topleft",legend=colnames(portfolio),pch=19,col=topo.colors(10),
  #       cex=0.65)
  #label assets
  stats <- getStatistics(frontier[[i]])
  text(y=stats$mean,x=sqrt(diag(stats$Cov)),labels=names(stats$mean),pos=4,col=colors[i],cex=0.7)
  #set up function from equalWeightsPoints to also label the point
  equalLabel <- function (object, return = c("mean", "mu"), risk = c("Cov", "Sigma",
                          "CVaR", "VaR"), auto = TRUE, ...)
  {
    return = match.arg(return)
    risk = match.arg(risk)
    data = getSeries(object)
    spec = getSpec(object)
    constraints = getConstraints(object)
    numberOfAssets = getNAssets(object)
    setWeights(spec) = rep(1/numberOfAssets, times = numberOfAssets)
    ewPortfolio = feasiblePortfolio(data, spec, constraints)
    assets = frontierPoints(ewPortfolio, return = return, risk = risk,
                            auto = auto)
    text(assets, labels = "Equal-Weight", pos=4, ...)
    invisible(assets)
  }
  #equalLabel(frontier[[i]],cex=0.7,col="grey")
  #label the frontier dates at minvariance point; again very sloppy but it works
  #text(x=min(frontierPoints(frontier[[i]])[,1]),
  #     y=frontierPoints(frontier[[i]])[which(frontierPoints(frontier[[i]])[,1]==min(frontierPoints(frontier[[i]])[,1]))[1],2],
  #     labels=datelabels[i],col=colors[i],pos=2)
  text(x=(minvariancePoints(frontier[[i]])[,1]),
       y=(minvariancePoints(frontier[[i]])[,2]),
       labels=datelabels[i],col=colors[i],pos=2)
}
title(main="Global Biggest Cap Efficient Frontier",xlab="Risk(cov)",ylab="Monthly Return")
mtext(side=3, text="source: http://mba.tuck.dartmouth.edu/pages/faculty/ken.french/data_library.html",font=3,cex=0.8)
#also parallel coordinates of each of the minvariance points might be interesting
minvar <- as.data.frame(rbind((minvariancePoints(frontier[[1]])),(minvariancePoints(frontier[[2]])),(minvariancePoints(frontier[[3]]))))
rownames(minvar) <- datelabels
require(MASS) #for parcoord
parcoord(minvar,col=colors)
#might be nice to do animated gif or parallel coordinates of weights or risk/return
weightsPlot(frontier[[3]])
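
Following up on the comment above about parallel coordinates of weights, here is a minimal sketch for the full-period frontier; it assumes frontier and portfolio from the code above and uses getWeights from fPortfolio (MASS supplies parcoord).

require(fPortfolio)
require(MASS) #for parcoord
#weights at each point along the 1990-2012 frontier
w <- getWeights(frontier[[3]])
colnames(w) <- colnames(portfolio)
#one line per frontier point, shaded from low risk to high risk
parcoord(w,col=heat.colors(NROW(w)))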

Monday, October 15, 2012

Not Much of a Grand Finale. What if We Go To 0?

When I ask the question “What if the US 10 year goes to 0?”, most do not know the effect, the catalyst, or whether 0 has ever happened before.  The math is simple enough to do in Excel or with an old-school calculator, but let’s use RQuantLib to do the pricing and then latticeExtra, with some slight adjustment in SVG, for the charts.  RQuantLib spits out a total return of 17% if we go to 0 by the end of October, which seems like a decent amount until we look at it on a chart.
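
The 17% is easy to sanity check without RQuantLib.  Here is a minimal back-of-envelope sketch, assuming a hypothetical 10 year par bond with a 1.7% coupon (roughly the 10 year yield at the time): at a 0% yield nothing is discounted, so the price is just the sum of the remaining cash flows.

#hypothetical sanity check, not the RQuantLib pricing in the gist below
coupon <- 1.7                      #assumed annual coupon, roughly the 10 year yield
price.at.zero <- 100 + 10*coupon   #ten coupons plus principal, undiscounted at 0% yield
price.at.zero/100 - 1              #about 0.17, the ~17% best case total return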

[chart from TimelyPortfolio: US 10 Year Cumulative Growth since 1954]

Mildly impressive, but the move is almost undetectable on a log scale.

[chart from TimelyPortfolio: US 10 Year Cumulative Growth since 1954, log scale]

Throughout history, we have really only one good reference point: Japan, whose 10 year very briefly touched 0.43%, and we need to remember that came amid an extended deflation in which stocks and real estate lost 90%.  That 17% return if we go to 0 (actually much less, since 0.43% rather than 0 was the stopping point) is not all that helpful in such a devastating environment.

Even stranger, the move we experienced over the last 15 months is greater than the potential move from here to 0.  On a six month change in yield chart, a move from 2% in April to 0% in October seems perfectly normal if we forget about the starting point.

[chart from TimelyPortfolio: US 10 Year 6 Month Change in Yield since 1954]

Similarly, a 12 month rolling total return chart does not reveal anything odd.

[chart from TimelyPortfolio: US 10 Year 12 Month Total Return since 1954]

However, the starting point is critical.  Instead of subtracting the starting yield from the ending yield, the ratio of ending yield to starting yield is more appropriate at these low levels.  Now we can see how unusual the move really is.

[chart from TimelyPortfolio: US 10 Year 12 Month % Change in Yield since 1954]
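
A quick hypothetical shows why the ratio matters at these levels: the same 1 point drop in yield is a far bigger proportional move starting from 2% than from 10%.

start <- c(10, 2); end <- c(9, 1)  #hypothetical starting and ending yields
end - start  #-1 and -1: identical on a difference basis
end/start    #0.9 versus 0.5: a 10% decline in yield versus a 50% decline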

If you are buying bonds to protect against or benefit from a disastrous, deflationary “end of the world”, please be aware that in the best case you make a fairly measly 17%.  Just moving back to where we were in Spring 2011 would mean a loss bigger than that absolute best-case gain.

THIS IS NOT INVESTMENT ADVICE.  ALL OF THE ABOVE IS SIMPLY FOR ILLUSTRATION.

R code from GIST: (do raw for copy/paste)

require(quantmod)
require(PerformanceAnalytics)
require(RQuantLib)
require(latticeExtra)
getSymbols("DGS10",src="FRED")
GS10 <- to.monthly(DGS10)[,4]
getSymbols("GS10",src="FRED")
#Fed monthly series of yields is the monthly average of daily yields
#set index to yyyy-mm-dd format rather than to.monthly mmm yyy for better merging later
index(GS10)<-as.Date(index(GS10))
#add next month as 0%
GS10 <- rbind(GS10,as.xts(0,order.by=as.Date("2012-10-01")))
GS10pricereturn<-GS10
GS10pricereturn[1,1]<-0
colnames(GS10pricereturn)<-"PriceReturn-monthly avg GS10"
#use RQuantLib to price the GS10 bond from monthly yields
#GS10 is a 10 year constant maturity series, so advance the maturity date by 10 years
for (i in 1:(NROW(GS10)-1)) {
  GS10pricereturn[i+1,1] <- FixedRateBondPriceByYield(yield=GS10[i+1,1]/100,issueDate=Sys.Date(),
                              maturityDate=advance("UnitedStates/GovernmentBond", Sys.Date(), 10, 3),
                              rates=GS10[i,1]/100,period=2)[1]/100-1
}
#total return will be the price return + yield/12 for one month
GS10totalreturn<-GS10pricereturn+lag(GS10,k=1)/12/100
colnames(GS10totalreturn)<-"TotalReturn-monthly avg GS10"
GS10totalreturn[1,1] <- 0
GS10cumul <- cumprod(GS10totalreturn+1)
asTheEconomist(xyplot(GS10cumul,scales=list(y=list(rot=0)),
                      main="US 10 Year Cumulative Growth since 1954"))
asTheEconomist(xyplot(log(GS10cumul),
                      scales=list(y=list(rot=0)),
                      main="US 10 Year Cumulative Growth (log scale) since 1954"))
asTheEconomist(xyplot(ROC(GS10cumul,12,type="discrete"),
                      scales=list(y=list(rot=0)),
                      main="US 10 Year 12 Month Total Return since 1954"))
roc <- ROC(GS10,12,type="discrete")
asTheEconomist(xyplot(roc["::2012-09-01"],
                      scales=list(y=list(rot=0)),
                      main="US 10 Year 12 Month % Change in Yield since 1954"))
roc["2012-10-01"] <- -1
asTheEconomist(xyplot(roc,
                      scales=list(y=list(rot=0)),
                      main="US 10 Year 12 Month % Change in Yield (with 0% for October 2012) since 1954"))
plot.zoo(GS10)
asTheEconomist(xyplot(diff(GS10,6),
                      scales=list(y=list(rot=0)),
                      main="US 10 Year 6 Month Change in Yield since 1954"))

Tuesday, October 2, 2012

Emerging as Low Vol

Extending the series begun with When Russell 2000 is Low Vol, I thought I should take a look at Emerging Market stocks during periods of low relative volatility to the S&P 500.  So that you can replicate this even without access to expensive data, let’s use the Vanguard Emerging Market Fund (VEIEX) and the Vanguard S&P 500 Fund (VFINX) as proxies.  In the 12 month rolling regression, we see the same fairly steadily increasing beta and correlation of the Emerging Market stocks to the S&P 500 that we saw in the Russell 2000.

[chart from TimelyPortfolio: Vanguard Emerging vs Vanguard S&P 500, rolling 1 year regression]

If I progress further on this research, I will have to work on an adaptive definition of “low vol” (one possible sketch follows the chart below), but for the purpose of this post, I defined “low vol” as

Emerging 50-day standard deviation − S&P 500 50-day standard deviation > −0.075

For the Russell 2000, we used a stricter −0.0125.  Although the numeric definition differs, the chart shows a very similar profile.

[chart from TimelyPortfolio: Vanguard Emerging VEIEX with low relative volatility periods shaded]
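
As a placeholder for that adaptive definition, here is a minimal sketch that replaces the fixed ex-post cutoff with a rolling quantile of the sd spread; the synthetic spread here is a stand-in for stdev[,1]-stdev[,2] in the code below.

require(zoo)
#synthetic stand-in for the 50 day sd spread between the two funds
set.seed(1)
spread <- zoo(cumsum(rnorm(500,sd=0.001)),as.Date("2011-01-01")+0:499)
#adaptive cutoff: trailing 1 year 25th percentile of the spread itself
cutoff <- rollapplyr(spread,width=250,FUN=quantile,probs=0.25)
#"low vol" when the spread sits above its own trailing quantile
lowvol <- spread > cutoff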

R code from GIST:

require(quantmod)
require(PerformanceAnalytics)
getSymbols("VEIEX",from = "1900-01-01") #use VEIEX Vanguard Emerging as proxy for emerging mkt stocks
getSymbols("VFINX",from = "1900-01-01") #use VFINX Vanguard SP500 as proxy for SP500
#get 1 day change for the emerging and sp500
roc <- na.omit(merge(ROC(VEIEX[,6],type="discrete",n=1),ROC(VFINX[,6],type="discrete",n=1)))
stdev <- rollapplyr(roc,FUN=sd,width=50)
#get relative strength of emerging versus S&P 500
rs <- VEIEX[,6]/VFINX[,6]
#do some trial graphs to see the interaction
plot.zoo(merge(stdev[,1]-stdev[,2],rs))
plot.zoo(merge(stdev[,1]-stdev[,2]/rs,rs))
plot.zoo(merge(stdev[,1]/stdev[,2],rs,VEIEX[,6]))
#create a PerformanceAnalytics rolling summary of emerging versus the S&P 500
charts.RollingRegression(roc[,1],roc[,2],width=250,main="")
title(main="Vanguard Emerging compared to the Vanguard S&P 500 (Rolling 1 Year)",outer=TRUE, line=-1.5, adj=0.05, cex.main=0.85)
#use colors provided in xblocks documentation
rgb <- hcl(c(0, 0, 260), c = c(100, 0, 100), l = c(50, 90, 50), alpha = 0.2)
plot.zoo(VEIEX[,6], #plot closing price of emerging
         bty="n",   #no box; will fill in later with abline
         las=1,     #no rotation on y axis labels
         xlab=NA,
         ylab=NA)
#note operator precedence: stdev[,2]/rs is evaluated before the subtraction
xblocks(index(VEIEX[,6]), as.vector(stdev[,1]-stdev[,2]/rs > -0.075),col = rgb[3]) #admittedly the -0.075 is ex-post
#connect the axes
abline(h=par("usr")[3]) #extend y axis
abline(v=par("usr")[1]) #extend x axis
abline(h=pretty(par("yaxp")),lty=1,lwd=2,col="white") #try something new for gridlines
title(main="Vanguard Emerging VEIEX (source: Yahoo! Finance)",outer=TRUE, line=-2, adj=0.05)
mtext("blocks denote periods where Emerging 50 day sd low compared to S&P 500 sd",side=3,adj=0.05,cex=0.7,font=3, line=-1.5)

Monday, October 1, 2012

When Russell 2000 is Low Vol

Continuing in my exploration of the Russell 2000 (Russell 2000 Softail Fat Boy), I thought I would try to approach the topic with a low volatility paradox mindset.  Since 2005, the beta of the Russell 2000 relative to the S&P 500 has exceeded 1.2 (with a max of 1.6) for almost every rolling 1 year period.  This suggests that the Russell 2000 is anything but low vol.

[chart from TimelyPortfolio: Russell 2000 vs S&P 500, rolling 1 year regression]
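
For just the beta line without the full charts.RollingRegression panel, here is a minimal sketch computing rolling beta directly as cov/var on the same Yahoo! Finance data used in the gist below.

require(quantmod)
getSymbols("^RUT",from="2004-01-01")
getSymbols("^GSPC",from="2004-01-01")
#daily returns of Russell 2000 and S&P 500
r <- na.omit(merge(ROC(RUT[,4],type="discrete",n=1),ROC(GSPC[,4],type="discrete",n=1)))
#rolling 1 year beta = cov(russell,sp500)/var(sp500)
beta <- rollapplyr(r,width=250,by.column=FALSE,
                   FUN=function(x) cov(x[,1],x[,2])/var(x[,2]))
plot.zoo(beta,las=1,main="Rolling 1 Year Beta of Russell 2000 vs S&P 500")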

However, we can take a simpler view by comparing the rolling 50-day standard deviation of the Russell 2000 with that of the S&P 500.  The Russell 2000 does very well, on both an absolute and a relative basis, when its rolling 50-day standard deviation minus that of the S&P 500 exceeds −1.25% (−0.0125), so the Russell 2000 performs best when its volatility approaches the S&P 500’s.  In low relative volatility environments, it seems we should own the high beta Russell 2000.  Note that the largest down moves all occur in the non-shaded time periods.

[chart from TimelyPortfolio: Russell 2000 with low relative volatility periods shaded]

I intentionally wanted this post to be simple, so I hid a lot of the preliminary work and extra links.  Far more went into this than appears above.

R code from GIST:

require(quantmod)
require(PerformanceAnalytics)
getSymbols("^RUT",from = "1900-01-01")
getSymbols("^GSPC",from = "1900-01-01")
#get 1 day change for the Russell 2000 and S&P 500
roc <- na.omit(merge(ROC(RUT[,4],type="discrete",n=1),ROC(GSPC[,4],type="discrete",n=1)))
stdev <- rollapplyr(roc,FUN=sd,width=50)
#get relative strength of Russell 2000 versus S&P 500
rs <- RUT[,4]/GSPC[,4]
#do some trial graphs to see the interaction
plot.zoo(merge(stdev[,1]-stdev[,2],rs))
plot.zoo(merge(stdev[,1]-stdev[,2]/rs,rs))
plot.zoo(merge(stdev[,1]/stdev[,2],rs,RUT[,4]))
#create a PerformanceAnalytics rolling summary of Russell 2000 versus the S&P 500
charts.RollingRegression(roc[,1],roc[,2],width=250,main="")
title(main="Russell 2000 compared to the S&P 500 (Rolling 1 Year)",outer=TRUE, line=-1.5, adj=0.05)
#use colors provided in xblocks documentation
rgb <- hcl(c(0, 0, 260), c = c(100, 0, 100), l = c(50, 90, 50), alpha = 0.2)
plot.zoo(RUT[,4], #plot closing price of Russell 2000
         bty="n", #no box; will fill in later with abline
         las=1,   #no rotation on y axis labels
         xlab=NA,
         ylab=NA)
#note operator precedence: stdev[,2]/rs is evaluated before the subtraction
xblocks(index(RUT[,4]), as.vector(stdev[,1]-stdev[,2]/rs > -0.0125),col = rgb[3])
#connect the axes
abline(h=par("usr")[3]) #extend y axis
abline(v=par("usr")[1]) #extend x axis
abline(h=pretty(par("yaxp")),lty=1,lwd=2,col="white") #try something new for gridlines
title(main="Russell 2000 (source: Yahoo! Finance)",outer=TRUE, line=-2, adj=0.05)
mtext("blocks denote periods where Russell 2000 50 day sd low compared to S&P 500 sd",side=3,adj=0.05,cex=0.7,font=3, line=-1.5)