
Percentile Channel Strategy Replication

February 16, 2015

Michael Kapler of the always excellent Systematic Investor blog has moved his publishing to GitHub to make it easier to post code. This has flown under the radar (even to me), and we are all grateful that he is back to publishing. He was able to reproduce the “Simple Tactical Asset Allocation with Percentile Channel Strategy” in his recent post here.

The table below compares the original strategy (channel rp) to other benchmarks, including: 1) ew - equal weighting of the assets in the portfolio; 2) rp - risk parity using the assets in the portfolio; 3) channel ew - the percentile channel TAA strategy using equal weighting; and 4) QATAA - the application of Mebane Faber's trend-following strategy from his now-famous paper, A Quantitative Approach to Tactical Asset Allocation (in this case QATAA uses the same underlying assets and cash allocation as the percentile TAA strategy). Of course QATAA is one of the inspirations for the strategy framework, and Meb always manages to publish interesting ideas on his World Beta blog. To avoid issues with different sources of extended data, Systematic Investor begins the test in 2010 using the underlying ETF data to show how the strategies have performed in the current bull market. If you are getting results in line with this test then you can feel comfortable that you have the details correct; if not, you can use R and the code provided by Systematic Investor in the post.

[Figure: channel strategy replication comparison table]

After comparing results, Michael and I show a near-identical match (I also get a Sharpe ratio of 1.42 and a CAGR of 8.93%) - a relief after all the commotion caused by the initial post (which was addressed in my now amusing rant over here). The original strategy is the best performer of the bunch since it applies multiple time frames as well as normalized bet sizing via risk parity (common for most trend-followers). As I have stated before, one of the reasons I like the Percentile Channel approach is that the signals are likely to be slightly different from what most asset managers and investors are using.

New Channel Concepts: Volatility-Adjusted Time Series

February 12, 2015


In the last several posts, I introduced some different methods for channel strategies, including Percentile Channels. A simple way to potentially improve (or at least take a different approach to) a Donchian Channel strategy is to use a different price input to generate trading signals. As stated in Error-Adjusted Momentum Redux, using any type of risk adjustment tends to improve performance by reducing some of the noise. That is easy to apply when using returns, but how do we apply this concept to a price-based strategy? Actually it is quite simple: using a fixed target percentage- say 1%- you multiply all returns since inception by the target divided by some lagged measure of standard deviation. Then you create an index of those returns which becomes the new price series (being careful to avoid any lookahead bias). This volatility-adjusted index is what generates the signals for your channel strategy instead of the traditional price history. Of course in backtesting, you receive returns on the actual price history and not on the volatility-adjusted index. As a final point of clarification, you are not changing your position size as a function of volatility; instead you are just changing the input price.
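A minimal sketch of this construction in Python follows; it assumes daily closes in a pandas Series, a 20-day volatility lookback, and a 0.75% target, all of which are illustrative choices rather than the original implementation.

```python
import pandas as pd

def vol_adjusted_index(price: pd.Series, target: float = 0.0075, vol_lookback: int = 20) -> pd.Series:
    """Build a volatility-adjusted index to use as the signal series for a channel strategy."""
    returns = price.pct_change()
    trailing_vol = returns.rolling(vol_lookback).std().shift(1)  # lagged to avoid lookahead bias
    scaled = returns * (target / trailing_vol)                    # multiply returns by target / volatility
    return (1 + scaled.fillna(0)).cumprod()                       # compound into the new "price" series
```

Signals would then be generated from this index, while backtest returns are still earned on the actual price history.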

So let's compare a traditional 120-day Donchian Channel strategy that buys the S&P500 on new 120-day highs and sells and goes to cash (SHY) on 120-day lows with the same strategy using a volatility-adjusted time series to generate signals. The lookback is a 20-day standard deviation to adjust daily returns to create the index (with a .75% vol target- note that the choice of target doesn't alter performance, just the scale of the index). For this test we use SPY with data from Yahoo, and SHY with data extended from Morningstar. Note that the red line is NOT the equity curve of the strategy, but rather the Volatility-Adjusted Index created using SPY. The performance of the strategy using the index for signals is also highlighted in red:
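Here is a sketch of how the 120-day signal logic might look on top of the index above; the helper names and the cash handling are assumptions, and the actual strategy returns would come from SPY (or SHY when flat), not from the index itself.

```python
import numpy as np
import pandas as pd

def donchian_signal(series: pd.Series, lookback: int = 120) -> pd.Series:
    """1 after a new `lookback`-day high, 0 after a new `lookback`-day low, otherwise hold the prior state."""
    upper = series.rolling(lookback).max().shift(1)
    lower = series.rolling(lookback).min().shift(1)
    signal = pd.Series(np.nan, index=series.index)
    signal[series >= upper] = 1.0   # new high: long SPY
    signal[series <= lower] = 0.0   # new low: move to cash (SHY)
    return signal.ffill().fillna(0)

# Returns accrue on the real assets, not on the signal series, e.g.:
# sig = donchian_signal(vol_adjusted_index(spy_price))
# strat_ret = sig.shift(1) * spy_ret + (1 - sig.shift(1)) * shy_ret
```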

[Figure: SPY with volatility-adjusted index signals]

In this case, performance is improved using the volatility-adjusted index for signals versus the actual SPY price. Here is the same strategy using DBC with the ETF data only (since the choice of extension of DBC can create significant variability in performance):

[Figure: DBC with volatility-adjusted index signals]

The strategy shows some promise and at certain times generates different signals than the traditional strategy. Perhaps using different risk metrics such as acceleration or using other filtering techniques may hold even more promise. This same concept can be applied with moving averages or any other type of price-based signal. Just another concept for the diligent researcher to experiment with. Perhaps applying fractals to generate charts may be another useful avenue of exploration.

A “Simple” Tactical Asset Allocation Portfolio with Percentile Channels (for Dummies)

February 8, 2015


I actually received a large volume of what could best be characterized as "hate mail" for one of the previous posts on percentile channels. In reading these comments I was reminded of Jimmy Kimmel's funny segments where celebrities read mean tweets about themselves. While I did not publish these comments (I do not wish to alienate or prohibit those people who are kind enough to comment on the blog), needless to say most of them implied that I had presented a fraudulent strategy that badly misrepresented true performance. Since exact details of the strategy were not provided, this is a difficult claim to justify. As a mountain of such comments piled in, I decided that it would be useful at this time to clarify how the allocations were calculated. The initial strategy was developed using a pre-built testing platform in VBA, so presenting the details of how the strategy calculates positions is easier than taking the time to build it in a spreadsheet.

It is rare that I present a formal strategy on this blog for several good reasons: 1) this is a blog for new ideas to inspire new strategies, not for sharing code or spoon-feeding people with recipes; 2) people actually pay money for strategies in the quantitative investment business, and giving one away for free seems like a pretty good deal. Who ever complains about free food? Hint: no one. 3) whenever I post strategies or indicators I get flooded with demands for spreadsheets and code. The tone of such emails is often terse or even desperate and implies that I have some sort of obligation to assist readers with replication or implementation on their end. Since the blog is free and competes for my often limited time while I engage in unrelated but paid business activities, meeting such demands is difficult to justify. I would note that even the authors of academic articles in reputable journals rarely provide either: a) easy instructions for replication- in fact it is notoriously difficult to replicate most studies since either the instructions are vague or details are missing; or b) assistance/support- authors rarely if ever provide assistance with replication and rarely answer such requests, even when their articles are supposed to contribute to some body of research (unlike a blog). I would like to think that CSSA has been considerably more generous over the years.

As a former professor of mine used to say: “I think you are responsible for doing your own homework and using your own brain”- perhaps a novel concept to those who simply wish to coast off the hard work and individual thinking of others. So without turning this into a prolonged rant, here is a “simple” (I will refrain from using that word in the future after the latest experience) schematic of how allocations are calculated for the strategy:

A couple of key details first: the strategy was rebalanced monthly (allocations calculated and held through the month), not daily. Also, the strategy is LONG ONLY. This means that any negative positions are ignored. The channel scores or signals in the initial calculation can be long or short, i.e. 1 or -1. This is probably the key reason why readers were unable to replicate the results, since they likely used 1 or 0.

[Figure: allocation calculation schematic ("tactical for dummies")]

Notice that negative positions are used to calculate allocations but are ignored in the final calculations. Furthermore, the cash position is allocated as the excess over the total of active allocations and is not included in the risk parity position sizing (which would make SHY a huge position due to its low volatility). So I hope that this helps readers implement/duplicate the strategy. Keep in mind that prior to 2006, some of the ETFs used had to be extended with other data which readers may not have access to. However, using ETF data only yields a Sharpe ratio of about 1.5. Beyond this, readers are on their own. Good luck!
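As a rough illustration of the schematic, here is a minimal sketch of that final allocation step; it assumes the per-asset channel scores (each channel contributing +1 or -1) have already been averaged, uses 20-day volatility for the risk parity weights, and the names are my own rather than the original VBA implementation.

```python
import pandas as pd

def final_allocations(scores: pd.Series, vol: pd.Series) -> pd.Series:
    """Monthly weights from average channel scores (in [-1, 1]) and 20-day volatility per asset."""
    rp = (1.0 / vol) / (1.0 / vol).sum()        # risk parity weights (SHY excluded from sizing)
    signed = scores * rp                         # negative positions appear here...
    longs = signed.clip(lower=0.0)               # ...but are ignored in the final allocation (LONG ONLY)
    longs["SHY"] = 1.0 - longs.sum()             # cash is the excess over the active allocations
    return longs
```

For example, if DBC's average score were -1, its signed weight would be clipped to zero and its risk parity share would simply flow into SHY as extra cash.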

Error-Adjusted Momentum Redux

February 2, 2015

James Picerno of Capital Spectator recently did a good review of Error-Adjusted Momentum in his post “A Momentum-Based Trading Signal with Strategic Value”. The Capital Spectator blog is rich with great content covering a diverse range of subjects from economics to asset allocation and investment strategy. Picerno has published numerous books, but my favorite is Dynamic Asset Allocation, which has a handy place on my bookshelf. Dynamic Asset Allocation is a good review of the case for a tactical approach to portfolio management.

To add some new ideas on the error-adjusted momentum strategy, I would suggest readers experiment with multiple time windows (i.e. the averaging period) and error lookbacks, as well as data of different frequencies- intraday, daily, or even weekly- and aggregate their signals to increase robustness. Risk or volatility can also be used in place of the error adjustment. The general concept of standardizing returns in some way to account for changing variance/error creates an effective nonlinear filter that is a superior substitute for an adaptive moving average. In contrast, a typical adaptive moving average approach attempts to vary the lookback window (make the moving average faster or slower) as a function of some indicator. Academic studies on moving averages show that this type of approach demonstrates little success with a wide range of time series data outside of financial markets.

I have personally tried virtually every method I could find with an adaptive moving average framework and have had no material success. Part of the problem is that shifting to shorter-term moving averages increases standard error because you are using less data. Furthermore, by ignoring older data and shifting to a shorter window, you assume that there is no memory from changes in the dynamics of the time series. The success of volatility forecasting methods demonstrates in part that the influence of changes in the time series decays over time rather than all at once. The error-adjusted momentum approach is a nonlinear filter, and in general this class of methods tends to work better in my experience with financial time series. This particular filter permits a sufficient lookback window for averaging to achieve a good estimate (from a statistical sample size perspective) and retains information from dynamics that have evolved over time. The key is that it simultaneously manages to emphasize/de-emphasize portions of the data set based on the observed error (or some other metric). Substituting a weighted moving average in place of a simple moving average in the filter can also better capture the path dependence of changes in error.
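For readers who want to experiment, here is a minimal sketch of this kind of error-adjusted filter; it assumes the adjustment divides each daily return by a trailing mean absolute forecast error before averaging, and the window lengths and function names are illustrative assumptions rather than the exact published parameters.

```python
import pandas as pd

def error_adjusted_momentum(price: pd.Series, avg_window: int = 200, error_window: int = 10) -> pd.Series:
    """Average of error-adjusted returns; a long signal when the result is above zero."""
    returns = price.pct_change()
    forecast = returns.rolling(error_window).mean().shift(1)                 # naive trailing forecast of each return
    mae = (returns - forecast).abs().rolling(error_window).mean().shift(1)   # trailing mean absolute error
    adjusted = returns / mae                                                 # de-emphasize returns made in noisy periods
    return adjusted.rolling(avg_window).mean()
```

Swapping the final simple average for a weighted moving average, or replacing the error term with volatility, are the kinds of variations suggested above.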

As with any approach there are many different ways to apply the same concept, and readers are encouraged to experiment. The caveat is that it is better to use multiple approaches in an ensemble than to select the very best approach- the more things we try via experimentation (especially if there is no logical theory/hypothesis attached), the greater the risk of data-mining. A favorite quote from one of the good blogs that I follow- Volatility Made Simple- says it best: “the concepts being exploited are much more important than the specific parameters chosen. All sets of parameters will, over the long-term, rise or fall together based on the success or failure of the core concept.”

A Simple Tactical Asset Allocation Portfolio with Percentile Channels

January 26, 2015

I prefer presenting new tools and concepts, but I know that there are a lot of readers who would like to see how they can be applied to creating strategies. So here is a very simple strategy that applies the Percentile Channels from the last post to tactical asset allocation. The strategy starts with only 4 diversified asset classes:

Equities– VTI (or SPY)
Real Estate– IYR (or ICF)
Corporate Bonds– LQD
Commodities–DBC

For cash we will use SHY.

Here are the rules:

1) Use 60, 120, 180, and 252-day percentile channels (corresponding to 3, 6, 9, and 12 months in the momentum literature - 4 separate systems) with a .75 long entry and .25 exit threshold: a long is triggered above the 75th percentile and held until the price exits below the 25th percentile (just like in the previous post)
2) If the indicator shows that you should be in cash, hold SHY
3) Use 20-day historical volatility for risk parity position-sizing among active assets (no leverage is used). That is, the position size is 1/volatility of asset A divided by the sum of 1/volatility across all assets.
4) Rebalance monthly (a rough code sketch of these rules follows below)
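Here is a minimal sketch of the signal portion of these rules; the helper names, data layout, and the treatment of the -1 state (clarified in the February 8 post above) are assumptions for illustration, not the original backtest code.

```python
import numpy as np
import pandas as pd

LOOKBACKS = [60, 120, 180, 252]   # roughly 3, 6, 9, and 12 months

def percentile_channel_signal(price: pd.Series, lookback: int,
                              entry: float = 0.75, exit_: float = 0.25) -> pd.Series:
    """+1 after closing above the entry percentile channel, -1 after closing below
    the exit percentile channel, otherwise carry the prior state."""
    upper = price.rolling(lookback).quantile(entry).shift(1)
    lower = price.rolling(lookback).quantile(exit_).shift(1)
    sig = pd.Series(np.nan, index=price.index)
    sig[price > upper] = 1.0
    sig[price < lower] = -1.0
    return sig.ffill()

# At each month-end: average the four channel signals per asset, size active (long) assets
# by (1/vol) / sum(1/vol) using 20-day volatility, and allocate any remainder to SHY.
```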

Here are the results for this simple strategy:

[Figure: percentile channel TAA strategy results]

This is a very consistent strategy, more notable for its low maximum drawdown and high Sharpe ratio (near 2) than for its sexy returns. Of course there are many alternatives to "spice" this up by varying the allocation among instruments, changing instruments, or using leverage. I wanted to keep the asset list short and simple, and I chose corporate bonds since they provide some of the defensive characteristics of Treasuries but with higher yields and arguably lower systematic risk (no sovereign risk). Substituting the 10-year Treasury (IEF) for corporate bonds produces nearly identical results (1.9 Sharpe, 11.8% CAGR, 5.8% max drawdown). There were better combinations of asset classes and parameters, but this compact list seemed manageable for a self-directed investor without a large portfolio. This is not the ultimate strategy by any means, but it shows how to use percentile channels to produce a viable approach to tactical asset allocation.

Percentile Channels: A New Twist On a Trend-Following Favorite

January 21, 2015

One of the most widely used trend-following approaches is the Donchian Channel, popularized by the famous "Turtle Traders." In fact, it was the subject of Donchian Channels that started my collaboration with Corey Rittenhouse with the popular post Percent Exposure Donchian Channel Method. One of the original turtle systems used a 55-day Donchian Channel that bought at new 55-day highs and sold at new 55-day lows. This system- along with many other popular systems- suffered an erosion in profitability as other people copied the same approach. What has often fascinated me is how one might go about front-running such systems to achieve superior profitability. While I was thinking about this concept, I theorized that entering prior to new highs or lows might create an early entry that would be sufficient to avoid false breakouts induced by system traders. As an alternative one could use Percentile Channels, which function like Donchian Channels but use a specified percentile of price instead of the maximum or minimum. Below is a picture comparing percentile channels to Donchian channels:
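As a minimal sketch of the difference, here is how both channel types might be computed over a rolling window; the lookback and percentile thresholds shown are the ones used in the test below, and the function names are my own.

```python
import pandas as pd

def donchian_channels(price: pd.Series, lookback: int = 55):
    """Classic Donchian Channels: rolling maximum and minimum of price."""
    return price.rolling(lookback).max(), price.rolling(lookback).min()

def percentile_channels(price: pd.Series, lookback: int = 55,
                        upper_pct: float = 0.75, lower_pct: float = 0.25):
    """Percentile Channels: rolling percentiles of price instead of the max/min."""
    return (price.rolling(lookback).quantile(upper_pct),
            price.rolling(lookback).quantile(lower_pct))
```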

[Figure: Percentile Channels vs. Donchian Channels]

For a fun experiment I decided to run a test using the Commodity Index (DBC, extended with index data) as a rough proxy for a trend-follower's portfolio, comparing Donchian Channels to Percentile Channels. The original 55-day Donchian Channel is used to trade long or short on new highs/lows, versus a 55-day Percentile Channel with 75th and 25th percentile thresholds.
The results from 1995-2014 are presented below:

[Figure: 55-day Donchian vs. Percentile Channel results, 1995-2014]

Interestingly enough, the percentile channels help to revive a broken system with earlier entries. Another turtle system- perhaps the most famous- used the 20-day Donchian Channel. For added robustness, let's see how percentile channels might revive this long-broken system:

[Figure: 20-day Donchian vs. Percentile Channel results]

While this isn’t a perfect proxy for a futures/trend-following portfolio, the results show that it is possible to revive old systems based on new highs and lows using a less restrictive percentile channel approach. This leads to earlier entries that avoid the noise generated from competing signals. Regardless, percentile channels are just another tool for trend-following and can create a wider range of support/resistance type systems by varying the chosen entry/exit threshold.

Distance Weighted Moving Averages (DWMA and IDWMA)

December 18, 2014

The distance weighted moving average is another nonlinear filter that provides the basis for further research and exploration. In its traditional form, a distance weighted moving average (DWMA) is designed to be a robust version of a moving average to reduce the impact of outliers. Here is the calculation from the Encyclopedia of Math:

[Figure: distance weighted moving average calculation example from the Encyclopedia of Math]

Notice in the example above that "12" is clearly an outlier relative to the other data points and is therefore assigned less weight in the final average. The advantage of this approach over simple winsorization (omitting outliers identified by the calculation) is that all of the data is used and no arbitrary threshold needs to be specified. This is especially valuable for multi-dimensional data. By squaring the distance values in the calculation of the DWMA instead of simply taking the absolute value, it is possible to make the average even more insensitive to outliers. Notice that this concept can also be reversed to emphasize outliers or simply larger data points. This can be done by removing the inversion of the distance as a fraction and simply using the distance weights directly. This can be called an "inverse distance moving average" or IDWMA, and is useful in situations where you want to ignore small moves in a time series, which can be considered "white noise," and instead make the average more responsive to breakouts. Furthermore, this method may prove more valuable for use in volatility calculations where sensitivity to risk is important. The chart below shows how these different moving averages respond to a fictitious time series with outliers:
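Here is a minimal sketch of both weighting schemes, assuming each point's weight is derived from its total absolute distance to the other points in the window (my reading of the calculation above); the function names and the rolling usage are my own.

```python
import numpy as np
import pandas as pd

def dwma(window: np.ndarray) -> float:
    """Distance weighted mean: points far from the rest (outliers) receive LESS weight."""
    dist = np.abs(window[:, None] - window[None, :]).sum(axis=1)  # total distance to the other points
    weights = 1.0 / (dist + 1e-12)
    return float(np.sum(weights * window) / np.sum(weights))

def idwma(window: np.ndarray) -> float:
    """Inverse distance weighted mean: points far from the rest (outliers/breakouts) receive MORE weight."""
    dist = np.abs(window[:, None] - window[None, :]).sum(axis=1)
    return float(np.sum(dist * window) / (np.sum(dist) + 1e-12))

# Example: a rolling 10-day DWMA of returns
# returns.rolling(10).apply(dwma, raw=True)
```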

[Figure: response of the DWMA, IDWMA, and SMA to a fictitious time series with outliers]

Notice that the DWMA is the least sensitive to the price moves and large jumps, while the IDWMA is the most sensitive, with the SMA response in between. The key is that no one of these moving averages is superior per se; rather, each is valuable for different applications and can perform better or worse on different time series. With that in mind, let's look at some practical examples. My preference is typically to use returns rather than prices, so in this case we will apply the different moving average variations- DWMA, IDWMA, and SMA- to two different time series: the S&P500 and Gold. Traders and investors readily acknowledge that the S&P500 is fairly noisy, especially in the short term. In contrast, Gold tends to be unpredictable using long-term measurements, but large moves tend to be predictable in the short term. Here is the performance using a 10-day moving average with the different variations from 1995 to present. The rules are long if the average is above zero and cash if it is below (no interest on cash is assumed in this case):
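A sketch of the test rule follows, reusing the dwma/idwma helpers above; data loading is omitted and the cash leg is simplified (no interest), per the assumption stated in the text.

```python
import pandas as pd

def ma_of_returns_strategy(returns: pd.Series, avg_func, window: int = 10) -> pd.Series:
    """Daily strategy returns: long when the moving average of returns is above zero, otherwise cash."""
    avg = returns.rolling(window).apply(avg_func, raw=True)
    in_market = (avg > 0).astype(float).shift(1)   # trade on the next bar to avoid lookahead
    return in_market * returns                      # cash earns nothing in this sketch

# e.g. compare ma_of_returns_strategy(spx_returns, dwma) with ma_of_returns_strategy(spx_returns, idwma)
```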

[Figure: S&P500 results]

[Figure: Gold results]

Consistent with anecdotal observation, the DWMA performs the best on the S&P500 by filtering out large noisy or mean-reverting price movements. The IDWMA, in contrast, performs the worst because it distorts the average by emphasizing these moves. But the pattern is completely different with Gold. In this case the IDWMA benefits from highlighting these large (and apparently useful) trend signals, while the DWMA performs the worst. In both cases the SMA has middling performance. One of the disadvantages of a distance weighted moving average is that the calculation ignores the position in time of each data point. An outlier is less relevant if it occurred, for example, over 60 days ago versus one that occurs today. This aspect can be addressed through clever manipulation of the calculation. However, the main takeaway is that it is possible to use different weighting schemes for a moving average on different time series and achieve potentially superior results. Perhaps an adaptive approach would yield good results. Furthermore, careful thought should go into the appropriate moving average calculation for different types of applications. For example, you may wish to use the DWMA instead of the median to calculate correlations, which can be badly distorted by outliers. Perhaps using a DWMA for performance or trade statistics makes sense as well. As mentioned earlier, using an IDWMA is helpful for volatility-based calculations in many cases. Consider this a very simple tool to add to your quant toolbox.
