Building Trading Systems



  Keys to Successful Trading
  Pre-Built Trading Systems
  Brief Overview of System Building
 Money Management
  Leading Indicators and Modeling


No, the keys to success are not our products, nor anyone else's. Rather, to be a successful trader you need ...

  • a trading system with profitable expectation,
  • sound money management principles,
  • the psychological fortitude to trade consistently, and
  • adequate capitalization.

Contrary to popular belief, your basic trading system only needs to be modestly profitable. A proper money management scheme designed to control your "bet size" can expand those meager profits substantially. Also, since human nature tends towards taking profits too soon and letting losses run too far, you need to be familiar with the market and not be emotionally caught up and then too afraid to follow your trading system's recommendations.
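The effect of bet sizing can be sketched with a simple fixed-fraction rule. This is a hypothetical illustration with invented numbers, not a specific Jurik formula:

```python
# Hypothetical fixed-fraction bet sizing: risk a constant fraction of
# current equity on each trade. All numbers are invented for illustration.

def position_size(equity, risk_fraction, stop_distance):
    """Units to trade so that a stopped-out trade loses at most
    `risk_fraction` of current equity."""
    return (equity * risk_fraction) / stop_distance

# Risk 2% of a $50,000 account with a $4-per-unit stop:
print(position_size(50_000, 0.02, 4.0))  # 250.0 units
```

Because the stake grows and shrinks with equity, a modestly profitable system compounds its gains, while a losing streak automatically cuts exposure.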

Therefore, we offer no get-rich-quick schemes. They do not work. Nor will we insult you with suggestions that if a billion dollar Capital Management firm hooked directly to Wall Street with a vast computer facility and scores of PhD's can make millions using product X, then so will you. You probably won't.

Nor can we promise the markets are so inefficient that it's easy to take in profits. It's not, simply because you will be competing with other very smart players, who want YOUR money.

On the other hand, we do offer powerful tools and educational products to help individual investors, like you, succeed in making an effective trading system. Jurik tools are compatible with many software products. Our satisfied customers agree!


It is a mistake to assume trading systems described in books, magazines or in your daily junk mail are profitable. They need to be tested over a prolonged period of historical data (enough for at least 500 trades). The best single test you can apply to any trading strategy for sale is this:

Will the seller provide a broker's statement
showing the most recent 200 consecutive
trades called by the strategy?

If the seller is unwilling or unable to do so, walk away.

If you are supplied with a broker's statement, plot the equity curve and see if you can handle (financially and emotionally) any strings of losses. Also, try to get a scatterplot containing the maximum adverse excursion of every trade. Sometimes a trade first loses big before becoming profitable. Can you handle those situations properly?
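Given such a broker's statement, plotting the equity curve and measuring its worst drawdown takes only a few lines. The trade values below are invented for illustration:

```python
# Sketch: turn a list of per-trade profits/losses into an equity curve
# and measure the maximum drawdown (worst slide from a prior peak).

from itertools import accumulate

pnl = [120, -80, 45, -60, -55, 200, -30, 90]   # invented per-trade P&L
equity = list(accumulate(pnl))                 # running equity curve

peak, max_dd = float("-inf"), 0
for e in equity:
    peak = max(peak, e)                        # best equity seen so far
    max_dd = max(max_dd, peak - e)             # deepest drop from a peak

print(equity)   # [120, 40, 85, 25, -30, 170, 140, 230]
print(max_dd)   # 150 -- could you sit through a 150-point slide?
```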

When the market changes its behavior, system performance may degrade into a loser. Will you have to pay additional $$ for periodic upgrades?

Finally, how much do you expect to learn about trading from a system you cannot analyze nor modify?

We believe you are better off making your own trading system than buying one. Your system will be designed around your financial resources and psychological comfort zone. And you will be able to modify it according to changing market conditions. Last, but not least, you will know exactly how well it can be expected to perform.

Brief Overview of System Building

Consider getting market charting software compatible with Jurik tools.

The next step is to acquire our JMA add-in. JMA has the greatest number of uses, smoothing prices and other technical indicators with very little lag. Users have found new applications by exploiting JMA's ultra-smooth lines.

Our other advanced trading tools, CFB, VEL and RSX, further enhance trading system design by offering new ways to measure price action behavior. CFB measures market trend duration (no classic indicator does this). VEL delivers an ultra-smooth measure of market momentum, with no more lag than the classical momentum indicator. RSX is Jurik's version of the classic RSI, except that RSX is also ultra-smooth. When you see RSX, you will never want to use RSI again!!

Once you are comfortable with building trading systems using concurrent (price) and lagging (classical) indicators, you may now want to expand your capabilities by adding LEADING indicators. Of course, all popular leading indicators are worthless as the market has already discounted the information they offer. Instead, you will need to create your own leading indicators.

A leading indicator is supposed to forecast some aspect of market behavior. These days, sophisticated non-linear modeling procedures (such as neural networks) are required. To get started, we recommend you acquire and become familiar with the spreadsheet application Excel, by Microsoft. Then acquire a neural network add-in to Excel. There are several on the market.

After familiarizing yourself with neural net development, get experience building leading indicators and using our pre-processing tools for MS Excel.

Four qualities of great technical indicators

Almost all technical indicators involve taking some form of an average of historical values in order to reduce market noise, which appears as high-speed jitter. Analysts typically ignore noise jitter because it has neither trend nor repeatable patterns. Consequently, most moving averages have a "length" parameter which effectively controls the indicator's apparent smoothness and, in an inverse way, its accuracy. That is, the smoother a filter becomes, the less it accurately reflects local market action.

This makes sense, since the user may define jitter to be any action that trends less than N bars. Therefore, we see the market player trying to apply just enough smoothness to filter out noise without removing important structure that is relevant in his desired time frame. In short, ...

With almost all technical analysis indicators, the user
makes a tradeoff between smoothness and accuracy.
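The tradeoff is easy to see with a plain simple moving average (SMA), sketched here on invented prices:

```python
# Sketch: a longer moving average is smoother but hugs the data less
# closely -- the smoothness/accuracy tradeoff in its simplest form.

def sma(series, length):
    """Simple moving average; one output per fully-covered window."""
    return [sum(series[i - length + 1:i + 1]) / length
            for i in range(length - 1, len(series))]

prices = [10, 12, 11, 13, 15, 14, 16, 18, 17, 19]
print(sma(prices, 3))  # [11.0, 12.0, ..., 18.0]: close to price, jittery
print(sma(prices, 7))  # far smoother, but trails well below the uptrend
```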

Accuracy can be measured in several ways: robustness, overshoot, timeliness, and proximity. These measures will be described in the context of a hypothetical moving average filter.

Proximity

Simply put, you want a filter (e.g. a moving average) to produce a noise-free version of the original signal, one that rides neither higher nor lower than the original series. The result should resemble what you would produce if, given a pen, you manually traced through the important market action.

Timeliness

All technical indicators that strictly examine past data values (i.e. do not look into the future) are called "causal". These are the only ones available to you when trading the market in real-time. All causal filters have a fundamental problem: they lag behind the original time series. Lag in your technical indicators only serves to delay what you need to see right now. This is a critical issue because excessive delay and late trades may reduce profits significantly.

Ideally, you would like a filtered signal to be both smooth and lag free. However, for all causal filters, greater smoothness produces greater lag and there is no "penalty free" way around it. Attaining smoothness without adding significant lag or other unwanted idiosyncrasies has baffled financial analysts as well as signal processing folks for years. We at Jurik Research understand the nature of lag very well, and employ proprietary formulas that address this fundamental issue.
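The lag penalty can even be quantified: on a steady trend, an N-bar simple moving average trails price by exactly (N-1)/2 bars.

```python
# Sketch: measuring the lag of a causal filter on a steadily rising
# series. An N-bar SMA trails a 1-point-per-bar ramp by (N-1)/2 bars.

def sma_last(series, length):
    return sum(series[-length:]) / length

ramp = list(range(100))      # price rising 1 point per bar
n = 9
out = sma_last(ramp, n)      # average of the last 9 bars
lag_bars = ramp[-1] - out    # how far the filter trails current price
print(lag_bars)  # 4.0, exactly (9 - 1) / 2
```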

Overshoot

One common approach toward reducing lag is to add some "inertia" into the formula, enabling a filter to follow trends more closely without sacrificing smoothness. However, the penalty paid is when a market quickly reverses direction. The filter's inertia prevents it from quickly changing direction, and continues to overshoot for some time before reversing direction. The more inertia you apply, the greater the overshoot. ... And this can create a real problem.

Some trades are triggered when a moving average of price crosses a user specified threshold. For example, suppose price trends up toward a threshold but reverses direction just in time to not actually break threshold. A filter with too much inertia will overshoot and break threshold, even though price did not. This false trigger may produce an unwanted trade.
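A toy filter with an explicit velocity term shows the false trigger in action. The filter formula and all numbers here are invented purely for illustration:

```python
# Sketch: a toy "inertia" filter carries velocity forward, so it keeps
# rising after price stalls -- and breaks a threshold price never broke.
# The filter formula and all numbers are invented for this example.

def inertia_filter(series, alpha=0.5, beta=0.8):
    f, v = series[0], 0.0
    out = [f]
    for p in series[1:]:
        v = beta * v + alpha * (p - f)   # beta retains old velocity (inertia)
        f = f + v
        out.append(f)
    return out

price = [100, 104, 108, 112, 114, 113, 112, 111]   # stalls below 115
filt = inertia_filter(price)
threshold = 115
print(max(price) > threshold)   # False: price never breaks 115
print(max(filt) > threshold)    # True: the overshooting filter does
```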

Robustness

To remove noise in a time series, common filters use mathematical techniques that have been around for years. The underlying theory in almost all cases assumes changes in market prices have a Normal (Gaussian) distribution. This may be true for noise in your car radio or cassette tape recorder, but not for the market. Gaps in market prices occur more frequently, by orders of magnitude, than the Gaussian curve suggests. Consequently, common filters respond to price shocks very poorly.

Players need a filter that is robust against price shocks. This calls for a special kind of signal processing called "nonlinear" filtering. Our flagship tool, JMA, is such a filter and can handle price shocks better than any other moving average available on the market today. In fact, the bigger the gap, the more obvious JMA's superiority becomes.
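One simple nonlinear filter, the running median, illustrates the idea. This is a generic textbook example, not JMA's proprietary algorithm:

```python
# Sketch: a one-bar price shock drags a 5-bar average far off course,
# while a 5-bar median (a simple nonlinear filter) barely notices it.

from statistics import mean, median

prices = [100, 101, 100, 140, 101, 100, 101]   # 140 is a one-bar shock
window = prices[-5:]                           # the last five bars
print(mean(window))    # 108.4 -- the linear average absorbs the shock
print(median(window))  # 101   -- the nonlinear median rejects it
```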

Jurik Research has achieved these results by first developing and testing algorithms in MATLAB, the engineer's choice for software simulation. We try to avoid making any assumptions about the signal being processed, other than that it is a random walk of accumulated Cauchy (not Gaussian!) distributed price changes. This way the algorithms cannot be fooled by atypical market action. Next, we let qualified beta testers look for problems. Finally, after we make each product available...

Jurik Research offers monetary reward to each
person who is the first to report any specific
error in our software or documentation.

At Jurik Research, there is no substitute for excellence.

Sequence for advanced system building


If some humans can trade consistently well, then why can't a computer? Why can't it be your computer? Shades of artificial intelligence; haven't we heard these questions before? Artificial Intelligence, regardless of its formal definition (if it ever had any), translates to hard, and often fruitless, work. Persistence does pay off, however. Structured methodology and systematic experimentation are the recommended modus operandi.

We define an advanced system as one that includes some aspect of a leading indicator, which implies forecasting is involved. Leading indicators could be designed for almost anything, but we prefer using them to forecast an upper and lower price range as well as future MACD values. Proper leading indicator development calls for preprocessing with WAV and DDR and modeling with a neural network program. Lastly, all this needs to be performed in a systematic fashion.

To accomplish this, I designed this flow diagram to see the big picture. It subdivides trading system development effort into various stages. Here is a multi-stage review of our advanced system building process. You may change any aspect of it to suit your particular needs.


Here is a description of how I build my own trading systems. The flow chart shows six stages of trading system development :

  1. Select explanatory data (collection stage)
  2. Create low-lag indicators (preprocessing stage)
  3. Create leading indicators (modeling stage)
  4. Build your trading system (strategy stage)
  5. Backtest your trading system (verification stage 1)
  6. Trade with a simulated broker (verification stage 2)

Stage 1: Collection

This involves the unexciting task of collecting and verifying financial data. It does not help your system's self-image to give it historical prices peppered with blanks and zeroes. Eyeball it for any problems.

Research has shown that if you convert price data to the LOG (logarithm) of price data, strategies will work better over a longer period. This is because price data is now expressed in a multiplicative relation to each other, rather than additive, and this tends to be preserved as prices change scale over time.
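The point of the LOG transform is that equal percentage moves become equal increments:

```python
# Sketch: in log space, a 10% move is the same size whether price is
# near 10 or near 1000, so rules keep their meaning as prices rescale.

from math import log, isclose

step_low  = log(11.0)   - log(10.0)     # 10% rise from 10
step_high = log(1100.0) - log(1000.0)   # 10% rise from 1000
print(isclose(step_low, step_high))     # True: identical in log space
print(11.0 - 10.0 == 1100.0 - 1000.0)   # False: raw moves differ 100-fold
```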

Stage 2: Preprocessing

This stage involves data preprocessing. Briefly, this is where we extract meaningful indicators from raw financial data. Good preprocessing makes the next stage (modeling) run smoothly. Professional modelers realize the importance of this step and focus most of their energy here. However, to the amateur it has the same appeal as washing laundry.

Determine the optimal "forecast horizon" for the time series to be predicted. For example, the optimal distance to forecast into the future when using daily bars of 30-Year T-Bonds is 5.5 days. This value varies from market to market and the method for calculating it is explained in my book Financial Forecasting and Neural Networks.

Determine how much historical data is needed to make a SINGLE forecast. I refer to this amount of historical time as the "lookback horizon" and its size is typically 4 times the forecast horizon. For example, if my forecast is to predict 5.5 bars into the future, then my lookback horizon (L) for each forecast will need to be 22 bars. (L=22) All indicators need to consider the activity of at least the most recent L bars.
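The rule of thumb above reduces to one line of arithmetic:

```python
# Sketch of the lookback-horizon rule: L is about 4x the forecast
# horizon, rounded up to whole bars.

from math import ceil

def lookback_horizon(forecast_horizon, multiple=4):
    return ceil(multiple * forecast_horizon)

print(lookback_horizon(5.5))  # 22 bars, matching the T-Bond example
```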

Select appropriate explanatory data such as highs, lows, volume, etc. I encourage you to investigate pre-smoothing price data first with JMA, thereby creating "proxies" for the raw price values. Next, create relevant indicators (RSX, VEL, CFB, channels, JMA-MACD, etc) by applying them to the JMA proxies, instead of the raw price data. Set the "length" parameter of your indicators so that the number of bars considered by each formula is approximately the lookback horizon (L).

Make sure each column of indicator values resembles a zero-mean, standardized oscillator (i.e. Z-score series), and is not a random walk (e.g. raw market prices). This is because a random walk will eventually enter a range the model has not seen during development, inducing failure.
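Standardizing an indicator column to a Z-score series is straightforward:

```python
# Sketch: rescale an indicator column to zero mean and unit variance
# (a Z-score series) so its range stays inside what the model has seen.

from statistics import mean, pstdev

def zscore(series):
    m, s = mean(series), pstdev(series)
    return [(x - m) / s for x in series]

z = zscore([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(mean(z))    # 0.0
print(pstdev(z))  # 1.0
```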

Apply WAV to the above indicators, in order to compress the most recent L values of each indicator into a much smaller number of values. For example, WAV can compress the most recent 73 values of an indicator into just 13, a compression of 82%! When building forecast models, it is important to reduce the number of input variables as much as possible, preferably without losing valuable information in the process.

Gather the time compressed values of each indicator (i.e. WAV's output) into an array (one column per indicator) and apply DDR. This procedure reduces the number of columns in the array by extracting out all redundancy between columns. The result is an array with much fewer columns, all columns are mutually uncorrelated (each column is carrying different information), and little to no information was lost in the process.

At this point, your data is both temporally and spatially compressed. If your model had to receive the most recent 73 values of each of 10 indicators without spatio-temporal compression, your forecast model would be looking at an input array of 730 values for each forecast. However, after spatio-temporal compression, the new array would likely be 13 values for each of only 4 columns, just 52 values total. This represents a final compression of 93% !!
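The compression arithmetic in this example:

```python
# Sketch of the spatio-temporal compression arithmetic described above.

raw = 73 * 10          # 10 indicators x 73 recent values = 730 inputs
compressed = 13 * 4    # after WAV (73 -> 13) and DDR (10 -> 4 columns)
print(compressed)                      # 52 values per forecast
print(round(1 - compressed / raw, 2))  # 0.93 -- a 93% reduction
```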

Stage 3: Modeling

Stage 3 is where you get to play with and learn about sexy modeling tools such as ARIMA, expert systems, genetic algorithms and neural networks. Typically, the novice will completely skip stage 2 and spend months trying to make it all happen in stage 3. This leads to complaints that the [expletive deleted] neural net is brain-dead.

Choose what you want the model to forecast. Keep it simple, like estimating the MACD five bars out, or estimating resistance and support (relative to the current average price) 10 bars out. Avoid attempts to forecast raw market prices (unless you are really good at predicting pseudo-random variables). Make sure your column of forecast target values resembles a zero-mean, standardized oscillator (i.e. Z-score series), and is not a random walk (e.g. raw market prices). This is because a random walk will eventually enter a range the model has not seen during development, inducing failure.

Feed the compressed array you created in stage 2 and target data to your model. Verify all models with data that was not used during development. As a rule of thumb, for each input (independent) variable fed into your model, you will need sufficient training and verification data to support at least 100 forecasts. Thus if your model receives 54 input variables per forecast, you need enough data to support 100*54 or 5,400 forecasts during model creation and verification.


Information about different paradigms for modeling leading indicators is provided further down this page. (Keep reading ... you will get there).

Stage 4: Strategy

This stage is for developing trading logic. It is the most "fun" part of system building, provided you know what you are doing. There are many books on this topic. With regard to using forecast models, here are a few pointers:

Create rules for risk and money management. There are books to help you with this subject.

One clever risk management technique is to create multiple stochastically trained models (e.g. neural networks) for making the same forecast. When all models are in strong agreement, increase your risk. When they are in strong disagreement, lower your risk.
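One sketch of that idea: scale risk by how tightly the ensemble's forecasts cluster. The specific weighting formula below is our invention for illustration:

```python
# Sketch: risk scaled by ensemble agreement. When independently trained
# models disagree, their forecast spread rises and the risk weight drops.
# This particular weighting formula is invented for illustration.

from statistics import pstdev

def agreement_weight(forecasts, max_spread):
    spread = pstdev(forecasts)               # disagreement among models
    return max(0.0, 1.0 - spread / max_spread)

print(agreement_weight([0.9, 1.0, 1.1], max_spread=1.0))   # near 1: risk more
print(agreement_weight([-1.0, 0.2, 1.4], max_spread=1.0))  # near 0: risk less
```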

Stage 5: Verification (backtesting)

During backtesting, pore over statistics such as return on account (considering maximum drawdown), maximum adverse excursion charts, Monte Carlo simulations of expected fiscal half-life, etc. In doing so, look for the system's bad trades and conjure up design modifications.

Consider how many variables, constants and lines of code you are tweaking (optimizing). Each one is a degree of freedom you are playing with. When backtesting, use sufficient market data for the system to create 100 trades for each degree of freedom. Thus if you are optimizing 5 constants and tweaking 4 lines of code, verification calls for each run to produce at least 100*(4+5) or 900 trades.

Be mindful about optimizing trading systems. Undisciplined and excessive conjuring of code may lead to over-optimized spaghetti logic, a nightmare to maintain. Also, too much optimization will yield great performance on your current data set, but miserable performance on future data. Our book Financial Forecasting and Neural Networks and audio tape Space, Time, Cycles and Phase offer an explanation of this phenomenon. A system that trades well on both historical data and future data is most desirable.

Stage 6: Verification (simulated trading)

During live "paper trading", keep an eye out for how quickly the system degrades. This suggests how frequently the models need to be updated. It may also suggest poor trading logic.

One example of a neural network enhanced trading system that ran well, without retraining, for many months after its development, is described in the December 1996 issue of Futures Magazine. Although the test and verification procedure used by the author was not the best, the result proved to be profitable nonetheless.

Money Management

You really do not need to optimize the heck out of your trading system as long as you employ good risk management. It addresses the question: how much are you putting at risk in a trade versus the expected profit for taking that risk? Like an expert poker player, with proper money management you evaluate how much to invest and how much you are willing to lose on each gamble. Therefore, the basic principle of money management is risk management. Opening positions with risk covered is fundamental to successful trading. In other words, manage the risk first and profits will follow when your bet is correct. It's amazing how much this discipline can improve your system's overall profitability. Over a period of years, this technique can improve trading profits more than ten-fold!

Some books on money management are listed HERE.


Leading Indicators and Modeling

  Why leading indicators are difficult to make.
  What are neural nets? Any applications?
  What are genetic algorithms?

Why leading indicators are difficult to make


The "Composite of Leading Economic Indicators" is valued by the Federal Reserve and long term investors for its forecast potential. In contrast, speculative investors prefer technical and fundamental indicators with short term forecast potential. The problem is that almost all commonly used indicators (MACD, ADX, CCI, RSI, etc.) look behind and summarize what has occurred, not what will occur.

The rarity of good short term leading indicators tells us that they are difficult to produce, and more importantly, because so few investors exploit them, these indicators can yield a significant trading advantage. But why are they so rare? What's so difficult about creating a short term leading indicator?

"If all economists were laid end to end,
they would still point in all directions."
-- Arthur H. Motley

Their rarity is due, in part, to the nature of markets. In the past, when trading was not dominated by computers, most financial analysts used macro and micro economic theory as well as classical "linear" modeling techniques. Traditional market models, built on linear theory and its simplifying assumptions, produce forecasts that grow less accurate each year. Wall Street analysts have consistently missed every major turning point in the market for the past 30 years. For example, six months before the 1990 recession, 34 out of 40 economists agreed "the economy will probably avoid a recession". Also, just two weeks before the huge bull market in 1991, the consensus of these 40 economists was: "the economy will shrink for the next six months."

Traders and investors using systems based on classical analysis will also suffer serious losses when market conditions change too quickly for their models to "comprehend".

Jurik Research believes the problems with traditional market models stem from their assumptions, which I divide into three categories ...

1. Independence

Linear models work best when their input variables are independent (not correlated with each other). Highly correlated input variables can lead to models that appear to work well on historical data, but which will fail miserably on new data. Such interdependencies do exist (e.g. the inverse relationship between commodities and bonds) and models that fail to account for this fact will have problems.

2. Linearity

Today the market moves faster and more chaotically, exhibiting disjointed, nonlinear relationships between market forces.

3. Homogeneity

To keep life simple, analysts assume all traders and investors are risk averse, rational and react in similar fashion. In reality, floor traders, short and long term traders, fund managers, hedgers, program traders and market makers all use different levels of risk and react in different time frames.

Clearly, we need a new family of models that can simulate nonlinear relations and players thinking in different timeframes. Consequently, efforts to find and exploit profitable niches in the markets are foregoing classical techniques for more powerful trading methods. New tools using artificial intelligence methods are increasing in popularity. These tools include neural networks and genetic algorithms.

Now that easy-to-use versions of both paradigms are available as add-ins to Microsoft Excel, the public is quickly catching on: it's not so hard after all.

Neural Networks: Do They Really Work?


A neural network (or NN) is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. Each element executes a mathematical formula, whose coefficients are "learned" when given examples of how the NN should respond to various data sets. Applications include data pattern recognition or classification.

During a "training" session, the NN produces a collection of simple nonlinear mathematical functions that mutually feed numerical values to each other in a way that vaguely resembles neural brain cell activity. The interaction between neurons can become so complex that knowledge of the mathematical formulas offers little to no insight into the model's overall "logic". Consequently, as long as the neural network performs well, its user rarely cares to know what exact equations are inside.
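A miniature "training session" makes the idea concrete. Below, a single sigmoid neuron learns the logical AND function by gradient descent; this is a toy sketch of the principle, which real NN packages apply across many interconnected neurons:

```python
# Sketch: one sigmoid neuron "learning" logical AND. Its coefficients
# start ignorant and are nudged by the error on each example -- the
# same principle a full neural network applies at scale.

from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = 0.0, 0.0, 0.0              # weights start with no "knowledge"

for _ in range(5000):                  # repeated exposure to examples
    for (x1, x2), target in data:
        out = sigmoid(w1 * x1 + w2 * x2 + b)
        grad = (out - target) * out * (1 - out)   # slope of squared error
        w1 -= 0.5 * grad * x1          # nudge each coefficient
        w2 -= 0.5 * grad * x2
        b  -= 0.5 * grad

preds = [round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data]
print(preds)  # [0, 0, 0, 1] -- the learned mapping matches AND
```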

Be careful not to confuse neural networks (NN) with another artificial intelligence paradigm called expert systems (ES). ES programs are designed to mimic rational thinking as described by experts. However, if the expert cannot express his logic in a way that reliably yields correct decisions, the ES paradigm cannot be effectively employed. In contrast, a NN is not concerned with emulating human logic. A NN simply tries to map numerical input to output data. The mistaken belief that NN and ES paradigms are similar inevitably leads to the incorrect argument that if ES models perform poorly, then so will NN models. Fortunately, NN models are performing well in the real world.


In the commercial world neural networks are being used to ...

  • manage portfolio risk
  • assess loan credit risk
  • detect credit card fraud
  • forecast potato chip sales
  • detect unhealthy blood cells
  • optimize job shop scheduling
  • forecast financial market activity
  • optimize cold rolling of sheet metal
  • remove annoying telephone echoes
  • determine optimal prices for merchandise
  • detect explosives within luggage at airports
  • predict outcomes of new formulas for plastic

Don't expect a NN to do all the work for you and produce Buy/Sell signals. NNs must be coupled with traditional technical analysis, and best results come from experienced traders. That's because they understand which market indicators are more significant and also how to best interpret them. Therefore, it is best to design a NN to produce meaningful technical indicators, not a "Buy/Sell" holy grail.

The flow chart shows six stages of trading system development. Neural nets are typically used in the third, or MODELING stage. In this stage, neural nets are trained to model some aspect of the market, to classify either current or future market conditions, thereby telling the investor when to get in or out of the market. When forecasting future conditions, they are technically a "leading indicator".


There are many neural net packages available commercially. Many interact with the Microsoft Excel environment.


Because our standards of integrity are very high, at the risk of losing a sale, we feel compelled to mention the following. We do not imply that developing a neural network is an easy one-night stand. It will take time, and not everyone has the time to do so. Nor is a neural net by itself a trading system. Proper system development still requires the usual human effort, including:

  • Selecting the best information
  • Building and testing indicators
  • Interpreting the results
  • Deciding whether or not to place a trade
  • Deciding how much to invest (money management)

Details on issues and considerations when getting started are provided in this report, submitted to us by William Arnold, a contributing author to The Journal of Intelligent Technologies.

Lastly, questions arise as to how much a trader should trust a NN model. It will be difficult to trust your computer's decision to buy when fear in your mind cries out "Sell! Sell NOW!" Nevertheless, at conference after conference we hear users commenting that they would have made more money if they had not tried to outsmart and veto their system's decisions. After all, the whole purpose of building an artificially intelligent system is to avoid the same trades as the crowd, who on average, loses money in the market.


Yes, many. One money management firm has worked intensively with neural networks since 1988. They use 3000 neural nets, one for each stock they trade. They use both neural networks and genetic algorithms to separately predict the behavior of individual stocks. Although recommendations from both "experts" substantially narrow their selection, they are further refined with the aid of portfolio analysis, in an attempt to limit overexposure to any one stock or sector. Their research has paid off well as they were, at one point, managing a half billion dollars.

Other institutions that implemented operational neural forecasting systems include Citibank, Nikko Securities, Morgan Stanley, Dai-Ichi Kangyo Bank, Nomura Securities, Bear Stearns and Shearson Lehman Hutton. Advanced Investment Technologies (AIT), in Clearwater, Fla., has one of the longest track records using neural networks.

Here are some articles on neural net for financial applications you can probably find in a library:

  • "Training Neural Nets for Intermarket Analysis", Futures, August 1994
  • "How to Predict Tomorrow's Indicators Today", Futures, May 1996
  • "Going Fishing With A Neural Network", Futures Magazine, Sept. 1992
  • "Forecasting T-Bill Rates with a Neural Network," Technical Analysis of Stocks and Commodities, May 1995
  • "Using Neural Nets for Intermarket Analysis", Technical Analysis of Stocks & Commodities, Nov. 1992
  • "Developing Neural Network Forecasters For Traders", Technical Analysis of Stocks & Commodities, April 1992
  • "A Neural Network Approach to Forecasting Financial Distress", Journal of Business Forecasting, v10, #4.
  • "Forecasting with Neural Networks: An Application Using Bankruptcy Data", Information and Management, 1993, pp 159-167.
  • "Forecasting S&P and Gold Futures Prices: An Application of Neural Networks", J. of Futures Markets, 1993, pp 631-643.
  • "Neural Nets and Stocks: Training a Predictive System", PC AI, 1993, pp 45-47.
  • "Using Artificial Neural Networks to Pick Stocks", Financial Analysts Journal, 1993, pp 21-27.
  • "Analysis of Small-Business Financial Statements Using Neural Nets", Journal of Accounting Auditing and Finance, 1995, pp 147-172.
  • "Stock Price Prediction Using Neural Networks: A Project Report" NeuroComputing, 1990, #2
  • "Forecasting Bankruptcies Using a Neural Network," International Business Schools Computing Quarterly, Spring 1995

In contrast to standard linear regression models, NNs perform nonlinear regression modeling, which is orders of magnitude more flexible and powerful. When a user wisely decides on a NN's task and feeds it market data needed to perform that task, the model has potential to perform well because it ...

  • is inherently nonlinear and can "train" better than linear models in this environment.
  • can learn to see better than humans the various relationships among large numbers of indicators.
  • is dispassionate and consistent; NNs know neither fear nor greed.
  • can be automatically retrained over and over to accommodate new behavior in the markets.

Making money with sophisticated technology is a dual-edged sword. Without careful data preparation, you can easily produce useless junk. The first mistake made by novices using neural networks is failing to search for the most relevant data. A few top notch indicators will deliver better results than a few hundred irrelevant ones.

The second common mistake is to think that feeding a neural net 100 indicators will deliver better results than feeding it only ten. But large numbers of inputs require a large model which is difficult to train and maintain. Reducing data to its most compact form (and thereby reducing the NN model to its most compact form) greatly improves chances of success.

Two critical ways to compress data are sparse historical sampling (temporal compression) and redundancy reduction (spatial compression). Many market indicators are redundant because they reflect the same market forces at work, so eliminating redundancy is purely advantageous. As for sparse historical sampling, the goal is to find representative values for past points in time without letting important price patterns be skipped.

  • Jurik's WAV performs sparse historical sampling (temporal compression).

  • Jurik's DDR performs redundancy reduction (spatial compression).

  • Here is a nice tutorial on neural nets, presented as a Macromedia Flash interactive movie. Select a topic from the menu along the top of the movie screen.
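WAV and DDR are proprietary, so as a generic illustration of spatial compression only, here is a simple greedy pruning of highly correlated indicators. This stand-in is hypothetical and is not DDR's algorithm.

```python
import numpy as np

# Generic redundancy reduction: drop any indicator whose absolute
# correlation with an already-kept indicator exceeds a threshold.
# A hypothetical stand-in for illustration, NOT Jurik's DDR algorithm.
def prune_redundant(indicators: np.ndarray, threshold: float = 0.95) -> list:
    """indicators: shape (n_samples, n_indicators). Returns kept column indices."""
    corr = np.abs(np.corrcoef(indicators, rowvar=False))
    kept = []
    for j in range(indicators.shape[1]):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return kept

rng = np.random.default_rng(1)
base = rng.normal(size=(500, 1))
noise = rng.normal(size=(500, 1))
# Three indicators: two near-duplicates of the same market force, one independent
data = np.hstack([base, base + 0.01 * rng.normal(size=(500, 1)), noise])
print(prune_redundant(data))           # → [0, 2]: the near-duplicate is dropped
```

Columns 0 and 1 reflect the same underlying force, so only one survives; the independent column 2 is kept. A three-column model is now a two-column model.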

Do They Really Work?


Genetic Algorithms (GAs) are a general-purpose problem-solving technique. First, several random answers to a problem are generated. The worst answers are eliminated, and the best are "mutated" and "cross-pollinated" with each other to create additional answers that closely resemble their parents. This repeated process of elimination and regeneration gradually improves the quality of the answers; in this way, GAs simulate the evolutionary principle of "survival of the fittest." GAs are ideal for solving complicated problems with many independent (input) variables and a gigantic number of possible outcomes.
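The eliminate-recombine-mutate loop just described can be sketched in a few lines. The fitness function below is a deliberately trivial stand-in (count the 1-bits in a chromosome); all names and parameters are illustrative.

```python
import random

# Minimal GA loop: score a population, keep the best half,
# then cross-pollinate and mutate survivors to refill the population.
def genetic_algorithm(bits=20, pop_size=30, generations=60, seed=42):
    rng = random.Random(seed)
    fitness = lambda chrom: sum(chrom)          # toy problem: maximize 1-bits
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # eliminate the worst answers
        children = []
        while len(children) < pop_size - len(survivors):
            mom, dad = rng.sample(survivors, 2)
            cut = rng.randrange(1, bits)        # cross-pollinate at a random cut
            child = mom[:cut] + dad[cut:]
            i = rng.randrange(bits)             # mutate one gene
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = genetic_algorithm()
print(sum(best))                                # best fitness found
```

Because survivors are carried forward unchanged, the best answer never gets worse from one generation to the next; mutation and crossover supply the variation that lets it improve.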


Genetic algorithms have been used to find the optimal . . .

  • Budget allocation
  • Job shop schedule
  • Chemical inventory
  • Starting conditions
  • Military response
  • Investment portfolio
  • Fuel consumption
  • Investment trading rules
  • Electronic circuit design

With regard to financial applications, genetic algorithm optimization has been applied to . . .

  • Portfolio Balancing & Optimization
  • Budget Forecasting
  • Investment Optimization
  • Payment Scheduling

Major banks use a GA component in their loan evaluation programs, such as the one marketed by KiQ of London. Currency traders at Citibank use GAs to select characteristics of sequences of financial data in order to more accurately predict their future behavior. Stock traders at Salomon Brothers use GAs to search for optimal combinations of trading rules. Fund managers at Fidelity Investments use GAs to bundle securities so as to best satisfy their constraints. First Quadrant manages a $10-billion portfolio of pension funds and uses genetic algorithms to build investment models; models built by genetic algorithms made $255 for every $100 invested, compared with the typical $205. Financial managers at Merrill Lynch use GAs to hedge clients' exposure to price changes in foreign exchange markets.


After you have preprocessed your financial data and developed technical indicators to your liking, your next step is probably to translate these numbers into trading decisions: buy, sell, hold, exit, swap, straddle, leap, etc. Unfortunately, these decisions may involve very complex rules, based on lots of contingencies. For example, one such rule may resemble the following ...

Buy long only when A is rising, and B is less than C,
and interest rates just crossed below D.

Neural nets cannot optimize complex trading rules very well, but genetic algorithms can. The term genetic algorithm optimization (GAO) comes from the resemblance of this process to genetic evolution. If we treat the parameters A, B, C, ... as genes (parts) of one large chromosome, then as nature mutates chromosomes through mating and reproduction, natural selection eliminates the organisms that perform poorly in the real world. Eventually, organisms with optimal and near-optimal chromosomes survive.
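To show what "parameters as genes" means in practice, here is a hypothetical encoding of a rule's free parameters as one chromosome, with mutation and crossover operators acting on it. The parameter names are made up, and the fitness function is deliberately omitted; in practice it would backtest the rule over historical data.

```python
import random

# Hypothetical sketch: a trading rule's tunable parameters become the
# "genes" of one chromosome, so a GA can mutate and recombine them.
def random_chromosome(rng):
    return {"margin": rng.uniform(0, 5),        # e.g. how far B must be below C
            "rate_level": rng.uniform(0, 10)}   # e.g. the interest-rate level D

def mutate(chrom, rng):
    child = dict(chrom)
    gene = rng.choice(list(child))
    child[gene] += rng.gauss(0, 0.5)            # perturb one gene
    return child

def crossover(mom, dad, rng):
    # Each gene is inherited from one parent or the other
    return {k: rng.choice([mom[k], dad[k]]) for k in mom}

rng = random.Random(7)
mom, dad = random_chromosome(rng), random_chromosome(rng)
child = mutate(crossover(mom, dad, rng), rng)
print(sorted(child))                            # → ['margin', 'rate_level']
```

Natural selection then plays out exactly as in the GA loop described earlier: chromosomes whose rules lose money in the backtest are eliminated, and the profitable ones breed.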

Suppose you had a collection of 30 rules for trading,
but you want only 10 or fewer.
Which rules do you eliminate?

You might write a program to evaluate all 53 million combinations, or you could use the GA method. A GA would try a number of random combinations of rules, toss out the combinations that performed poorly, and make variations on the collections that performed well. Eventually, you are left with either the optimal or a near-optimal combination of rules.
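The 53 million figure is simply the number of ways to keep anywhere from 1 to 10 of the 30 rules, which can be checked directly:

```python
from math import comb

# Ways to choose between 1 and 10 rules out of 30
total = sum(comb(30, k) for k in range(1, 11))
print(total)   # → 53009101, i.e. about 53 million
```

Exhaustively backtesting 53 million rule subsets is usually impractical, which is why a GA's guided search through this space is attractive.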

