Reimagining Classic Strategies (Part 19): Deep Dive Into Moving Average Crossovers
This article explores the classical moving average crossover strategy and offers the reader multiple alternative solution paths they could follow to overcome the conventional problems of the strategy. Among many other well-known issues, the strategy is known to be noisy, to give delayed trading signals, and to be widely exploited. In simpler terms, this means that the trading signals given by the traditional moving average crossover strategy can be reversed too quickly and too frequently for traders to reliably profit from the strategy. Additionally, many false breakouts trigger early entries.
The strategy, in its traditional form, appears to be susceptible to noise and has unreliable mechanisms for filtering out weak and unprofitable trades. The solutions we present here help to overcome the weaknesses of the traditional strategy by building filters for trading signals that are far more robust to market noise. In particular, in this article, we explore five different variations of solutions aimed at filtering the weak trades out of the signals generated by the crossovers.
The moving average crossovers appear to be fertile ground for our statistical models to learn from. The statistical models we built learned the error that remained in the moving average crossovers—error that we were unable to manually filter out ourselves by creatively thinking of better trading rules. There is obviously a natural limit to how far our human intuition can guide us in optimizing a strategy, but where intuition falls short, our statistical models can help us pick up the rest of the work to be done.
Getting Started in MQL5
For this particular discussion, since we are going to consider multiple versions of the same strategy, it is important that we use parameters that will be fixed throughout all backtests to avoid repeating the same information over and over in each iteration. Therefore, all five versions of the applications that we are going to test are kept in one folder, as shown below in Figure 1.

Figure 1: Visualizing all the versions of the moving average crossover strategy we will assess
The test period will be fixed across all tests, from January 2022 up until the time of writing in 2025.

Figure 2: The test dates we have selected for all versions of our trading strategy

Figure 3: The backtest settings we have selected for all versions of our trading strategy
Establishing A Baseline
As with all processes, we begin by first establishing a baseline level of performance. To do so, we will implement the strategy in a widely accepted form that is commonly agreed upon by most traders. Therefore, the original strategy has two moving averages: one with a shorter period and the second with a longer period. In the traditional setup, when the faster moving average crosses above the slower moving average, we are inclined to take bullish positions, and when the converse is true, we look to take bearish positions in the market. This is the simplest version of the strategy possible that most traders agree with, and therefore we selected it as our baseline.
It offered us rules for entering and exiting trades, and our exit rules were then defined by the ATR to set our stop-loss and take-profit positions. We used equally spaced take-profits and stop-losses without any trailing settings to ensure that changes in profitability came from improved decision-making rules for our trade entries.
//+------------------------------------------------------------------+
//|                                              MA Crossover V1.mq5 |
//|                                  Copyright 2025, MetaQuotes Ltd. |
//|                                             https://www.mql5.com |
//+------------------------------------------------------------------+
#property copyright "Copyright 2025, MetaQuotes Ltd."
#property link      "https://www.mql5.com"
#property version   "1.00"

//+------------------------------------------------------------------+
//| Technical Indicators                                             |
//+------------------------------------------------------------------+
int    ma_fast_handler,ma_slow_handler,atr_handler;
double ma_fast_reading[],ma_slow_reading[],atr_reading[];

//+------------------------------------------------------------------+
//| Global variables                                                 |
//+------------------------------------------------------------------+
double ask,bid;

//+------------------------------------------------------------------+
//| Libraries                                                        |
//+------------------------------------------------------------------+
#include <Trade\Trade.mqh>
CTrade Trade;

//+------------------------------------------------------------------+
//| Expert initialization function                                   |
//+------------------------------------------------------------------+
int OnInit()
  {
//--- Setup our indicators
   ma_fast_handler = iMA("EURUSD",PERIOD_D1,30,0,MODE_SMA,PRICE_CLOSE);
   ma_slow_handler = iMA("EURUSD",PERIOD_D1,60,0,MODE_SMA,PRICE_CLOSE);
   atr_handler     = iATR("EURUSD",PERIOD_D1,14);
//---
   return(INIT_SUCCEEDED);
  }

//+------------------------------------------------------------------+
//| Expert deinitialization function                                 |
//+------------------------------------------------------------------+
void OnDeinit(const int reason)
  {
//--- Free up memory we are no longer using when the application is off
   IndicatorRelease(ma_fast_handler);
   IndicatorRelease(ma_slow_handler);
   IndicatorRelease(atr_handler);
  }

//+------------------------------------------------------------------+
//| Expert tick function                                             |
//+------------------------------------------------------------------+
void OnTick()
  {
//--- When price levels change
   datetime current_time = iTime("EURUSD",PERIOD_D1,0);
   static datetime time_stamp;
//--- Update the time
   if(current_time != time_stamp)
     {
      time_stamp = current_time;
      //--- Fetch current indicator readings
      CopyBuffer(ma_fast_handler,0,0,1,ma_fast_reading);
      CopyBuffer(ma_slow_handler,0,0,1,ma_slow_reading);
      CopyBuffer(atr_handler,0,0,1,atr_reading);
      ask = SymbolInfoDouble("EURUSD",SYMBOL_ASK);
      bid = SymbolInfoDouble("EURUSD",SYMBOL_BID);
      //--- If we have no open positions
      if(PositionsTotal() == 0)
        {
         //--- Trading rules
         if(ma_fast_reading[0] > ma_slow_reading[0])
           {
            //--- Buy signal
            Trade.Buy(0.01,"EURUSD",ask,ask-(atr_reading[0] * 2),ask+(atr_reading[0] * 2),"");
           }
         else if(ma_fast_reading[0] < ma_slow_reading[0])
           {
            //--- Sell signal
            Trade.Sell(0.01,"EURUSD",bid,bid+(atr_reading[0] * 2),bid-(atr_reading[0] * 2),"");
           }
        }
     }
  }
//+------------------------------------------------------------------+
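Before moving on, it may help to see the exit arithmetic in isolation. The short Python sketch below reproduces the symmetric, 2-ATR stop-loss and take-profit placement used by the expert advisor. The function name and the quoted prices are purely illustrative and are not part of the trading application.

```python
# Illustrative sketch of the symmetric ATR-based exits used by the EA.
# The function name and prices below are hypothetical examples.
def atr_exit_levels(entry_price, atr, multiple=2.0, direction="buy"):
    """Return (stop_loss, take_profit) placed `multiple` ATRs from entry."""
    offset = atr * multiple
    if direction == "buy":
        return (entry_price - offset, entry_price + offset)
    # For sells the levels mirror: stop above entry, target below it
    return (entry_price + offset, entry_price - offset)

sl, tp = atr_exit_levels(1.1000, 0.0050, 2.0, "buy")
print(sl, tp)
```

Because the stop and target are equally spaced, any change in profitability between versions of the strategy can be attributed to the entry rules rather than the exit logic.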
As explained in the introduction of the article, the original version of the strategy is not profitable and is very noisy, which explains the poor equity curve we observed when backtesting it over the test duration.

Figure 4: Visualizing the equity curve produced by the classical version of the trading strategy
When we move on to the detailed analysis of the strategy, we observe that the majority of the trades placed by the strategy were unprofitable, which is undesirable. Additionally, the strategy has a profit factor less than 1 and an expected payoff less than 0, meaning that it is expected to erode investor capital over time. Under most circumstances, such a strategy would be abandoned and completely replaced. However, let us look for ways to teach old dogs new tricks.
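To make these two statistics concrete, the sketch below computes a profit factor and an expected payoff from a small, made-up list of trade results. The numbers are hypothetical and only illustrate why a profit factor below 1 and an expected payoff below 0 imply that capital is eroded over time.

```python
# Hypothetical trade results in account currency; for illustration only.
trades = [12.0, -8.0, -15.0, 6.0, -10.0]

gross_profit = sum(t for t in trades if t > 0)    # 18.0
gross_loss = -sum(t for t in trades if t < 0)     # 33.0

profit_factor = gross_profit / gross_loss         # < 1: losses dominate wins
expected_payoff = sum(trades) / len(trades)       # average result per trade

print(round(profit_factor, 2), expected_payoff)   # 0.55 -3.0
```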
As stated earlier in our article, we want to minimize repetition of the same information; therefore, we focus our attention only on the segments of the code that have changed and ignore the segments that have not. After multiple tests, changes, and iterations, the adjustments below proved to be the most stable improvements we could devise manually: by checking whether the extreme wicks of price formed above the fast moving average or below the slow moving average, we could find better long and short entries.

Figure 5: The backtest results produced by the original version of our trading strategy
Initial Attempt To Improve The Baseline
Now that we have established a baseline, we can continue working to improve our trading strategy and add more rigorous filters to reduce the number of trades driven by noise. After evaluating multiple different configurations manually, we found reliable results when the extremes of the candle wicks were compared against the moving average indicator itself as a suggestion of market bias. Therefore, when the low wick was above the fast moving average, we were inclined to enter long positions, and when the high wick was beneath the slow moving average, we sought to enter short positions.
//--- If we have no open positions
if(PositionsTotal() == 0)
  {
   //--- Current daily candle extremes, fetched alongside the indicator readings
   double high = iHigh("EURUSD",PERIOD_D1,0);
   double low  = iLow("EURUSD",PERIOD_D1,0);
   //--- Trading rules
   if((ma_fast_reading[0] > ma_slow_reading[0]) && (low > ma_fast_reading[0]))
     {
      //--- Buy signal
      Trade.Buy(0.01,"EURUSD",ask,ask-(atr_reading[0] * 2),ask+(atr_reading[0] * 2),"");
     }
   else if((ma_fast_reading[0] < ma_slow_reading[0]) && (high < ma_slow_reading[0]))
     {
      //--- Sell signal
      Trade.Sell(0.01,"EURUSD",bid,bid+(atr_reading[0] * 2),bid-(atr_reading[0] * 2),"");
     }
  }
The equity curve obtained with the new set of rules is still unstable for the most part. However, it has a dominant upward trend, compared with the dominant downward trend observed in the original version of our strategy. Looking more closely at the figure, we can observe moments when the account's floating equity spiked above the final balance recorded when the trade was closed, implying that there is still valuable signal in our trading strategy that we are not yet capturing.

Figure 6: The changes we made to our application brought desirable changes to the equity curve we obtained from the trading logic
When we consider the detailed statistics of our trading strategy, we see considerable improvements. For starters, the total net profit has finally become positive after starting off in a significantly negative state. Additionally, the gross loss accrued over the entire backtest window has fallen, meaning our trading strategy is exposed to less risk than the original version. Furthermore, 135 trades were placed by the noisy version of the strategy, while the profitable version used fewer—specifically 107 trades—to produce more profit. The trading accuracy of our strategy has risen from dominantly unprofitable to marginally profitable, and our expected payoff has improved over the original version.
Figure 7: Our manual improvements to the strategy rectified its negative account balance problem from the initial test we performed
Second Attempt To Surpass The Baseline
With these results in hand, we were then motivated to try again to achieve even better results. However, at this point, our intuition was no longer of much use to us, and therefore we sought to learn better trading rules directly from the market’s historical behavior. We began by creating a script to fetch historical market data and write it to CSV format.
//+------------------------------------------------------------------+
//| ProjectName |
//| Copyright 2020, CompanyName |
//| http://www.companyname.net |
//+------------------------------------------------------------------+
#property copyright "Copyright 2024, MetaQuotes Ltd."
#property link "https://www.mql5.com"
#property version "1.00"
#property script_show_inputs
//--- Define our moving average indicator
#define MA_PERIOD_FAST 30 //--- Moving Average Fast Period
#define MA_PERIOD_SLOW 60 //--- Moving Average Slow Period
#define MA_TYPE MODE_SMA //--- Type of moving average we have
#define HORIZON 5 //--- Forecast horizon
//--- Our handlers for our indicators
int ma_fast_handle,ma_slow_handle;
//--- Data structures to store the readings from our indicators
double ma_fast_reading[],ma_slow_reading[];
//--- File name
string file_name = Symbol() + " Cross Over Data.csv";
//--- Amount of data requested
input int size = 3000;
//+------------------------------------------------------------------+
//| Our script execution |
//+------------------------------------------------------------------+
void OnStart()
{
int fetch = size + (HORIZON * 2);
//---Setup our technical indicators
ma_fast_handle = iMA(_Symbol,PERIOD_CURRENT,MA_PERIOD_FAST,0,MA_TYPE,PRICE_CLOSE);
ma_slow_handle = iMA(_Symbol,PERIOD_CURRENT,MA_PERIOD_SLOW,0,MA_TYPE,PRICE_CLOSE); //--- Applied price matches the EA's slow moving average
//---Set the values as series
CopyBuffer(ma_fast_handle,0,0,fetch,ma_fast_reading);
ArraySetAsSeries(ma_fast_reading,true);
CopyBuffer(ma_slow_handle,0,0,fetch,ma_slow_reading);
ArraySetAsSeries(ma_slow_reading,true);
//---Write to file
int file_handle=FileOpen(file_name,FILE_WRITE|FILE_ANSI|FILE_CSV,",");
for(int i=size;i>=1;i--)
{
if(i == size)
{
FileWrite(file_handle,
//--- Time
"Time",
//--- OHLC
"Open",
"High",
"Low",
"Close",
"MA F",
"MA S"
);
}
else
{
FileWrite(file_handle,
iTime(_Symbol,PERIOD_CURRENT,i),
//--- OHLC
iOpen(_Symbol,PERIOD_CURRENT,i),
iHigh(_Symbol,PERIOD_CURRENT,i),
iLow(_Symbol,PERIOD_CURRENT,i),
iClose(_Symbol,PERIOD_CURRENT,i),
ma_fast_reading[i],
ma_slow_reading[i]
);
}
}
//--- Close the file
FileClose(file_handle);
}
//+------------------------------------------------------------------+
//+------------------------------------------------------------------+
//| Undefine system constants |
//+------------------------------------------------------------------+
#undef HORIZON
#undef MA_PERIOD_FAST
#undef MA_PERIOD_SLOW
#undef MA_TYPE
//+------------------------------------------------------------------+ Analyzing Our Market Data in Python
After running the script, we moved to Python to analyze the data and learn trading rules from it. We started by importing the standard Python libraries.
#Import the standard python libraries
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
Then we read in the CSV file generated by the script.
#Read in the data
data = pd.read_csv("/content/EURUSD Cross Over Data.csv")
We defined how far into the future we wished to forecast with our statistical model.
#Label the data
#Define the forecast horizon
HORIZON = 5
Next, we then labeled our data with the changes occurring in the price feeds we were interested in. In this particular example, we found that there could be three suitable targets for our model to learn—the primary target being the expected return in the market, and the two additional targets representing the changes in the moving average crossover.
#Define targets
data['Target'] = data['Close'].shift(-HORIZON) - data['Close']
data['Target 2'] = data['MA F'].shift(-HORIZON) - data['MA F']
data['Target 3'] = data['MA S'].shift(-HORIZON) - data['MA S']

#Drop missing rows of data
data = data.iloc[:-HORIZON,:]

From there, we created two separate train and test partitions.
#Separate the test dates
train = data.iloc[:(-365*4),:]
test = data.iloc[(-365*4):,:]

Then we loaded the statistical model of choice.
from sklearn.linear_model import LinearRegression

Now, let us define our time series cross-validation tools.
from sklearn.model_selection import TimeSeriesSplit

tscv = TimeSeriesSplit(n_splits=5,gap=HORIZON)

We then separated the inputs and targets.

X = train.iloc[:,1:-3]
y = train.iloc[:,-3:]

At this step, we were almost ready to fit our model and export it to ONNX format. ONNX, which stands for Open Neural Network Exchange, is an open standard for representing machine learning models, allowing us to build a model in one framework and run it in another, independent of the framework used for training.
import onnx
from skl2onnx.common.data_types import FloatTensorType
from skl2onnx import convert_sklearn
We then initialized the model and fit it on the historical market data.
model = LinearRegression()
model.fit(X,y)
After that, we defined the model’s input shape.
initial_types = [('float_input',FloatTensorType([1,X.shape[1]]))]
And then we defined the model’s output shape.
final_types = [('float_output',FloatTensorType([1,3]))]
Lastly, we saved the model as an ONNX prototype.
onnx_proto = convert_sklearn(model,initial_types=initial_types,final_types=final_types,target_opset=12)
onnx.save(onnx_proto,'EURUSD MA.onnx')
Realizing The Improvements in MQL5
Once the model has been saved as an ONNX prototype, it can now be imported into the trading application.
//+------------------------------------------------------------------+
//| Resources                                                        |
//+------------------------------------------------------------------+
#resource "\\Files\\EURUSD MA.onnx" as const uchar onnx_proto[];
We begin by defining new global variables that will be associated with our ONNX models. These new variables are responsible for handling the inputs and outputs that our model gives us. We will also define a handler for us to run predictions from our model.
//+------------------------------------------------------------------+
//| Global variables                                                 |
//+------------------------------------------------------------------+
vectorf model_inputs,model_outputs;
long model;
Upon initialization, a few important steps must be taken to prepare our model for use in the market. First, we create the model from the ONNX buffer that we imported as a resource. Then we define the model's input and output shapes: it takes 6 inputs and returns 3 predictions. Finally, we verify that the model was created successfully before allowing the application to start.
//+------------------------------------------------------------------+
//| Expert initialization function                                   |
//+------------------------------------------------------------------+
int OnInit()
  {
//--- Setup the ONNX model
   model = OnnxCreateFromBuffer(onnx_proto,ONNX_DATA_TYPE_FLOAT);
//--- Define the model parameter shapes
   ulong input_shape[]  = {1,6};
   ulong output_shape[] = {1,3};
   OnnxSetInputShape(model,0,input_shape);
   OnnxSetOutputShape(model,0,output_shape);
   model_inputs  = vectorf::Zeros(6);
   model_outputs = vectorf::Zeros(3);
   if(model != INVALID_HANDLE)
     {
      return(INIT_SUCCEEDED);
     }
//---
   return(INIT_FAILED);
  }
If the ONNX model is no longer in use, we will release the model to free up memory resources.
//+------------------------------------------------------------------+
//| Expert deinitialization function                                 |
//+------------------------------------------------------------------+
void OnDeinit(const int reason)
  {
//--- Free up memory we are no longer using when the application is off
   OnnxRelease(model);
  }
After storing the inputs that we need, we fetch a prediction from our model whenever we have no trades open, and the model's prediction then serves as an additional filter on our trading rules. If the forecasted return exceeds 0, we are permitted to take long trades; if the model forecasts negative returns, we are permitted to take short trades.
//+------------------------------------------------------------------+
//| Expert tick function                                             |
//+------------------------------------------------------------------+
void OnTick()
  {
//--- When price levels change
   datetime current_time = iTime("EURUSD",PERIOD_D1,0);
   static datetime time_stamp;
//--- Update the time
   if(current_time != time_stamp)
     {
      time_stamp = current_time;
      //--- Fetch current indicator readings
      CopyBuffer(ma_fast_handler,0,0,1,ma_fast_reading);
      CopyBuffer(ma_slow_handler,0,0,1,ma_slow_reading);
      CopyBuffer(atr_handler,0,0,1,atr_reading);
      double open  = iOpen("EURUSD",PERIOD_D1,0);
      double close = iClose("EURUSD",PERIOD_D1,0);
      double high  = iHigh("EURUSD",PERIOD_D1,0);
      double low   = iLow("EURUSD",PERIOD_D1,0);
      //--- Store the model inputs in the same order as the training data
      model_inputs[0] = (float) open;
      model_inputs[1] = (float) high;
      model_inputs[2] = (float) low;
      model_inputs[3] = (float) close;
      model_inputs[4] = (float) ma_fast_reading[0];
      model_inputs[5] = (float) ma_slow_reading[0];
      ask = SymbolInfoDouble("EURUSD",SYMBOL_ASK);
      bid = SymbolInfoDouble("EURUSD",SYMBOL_BID);
      //--- If we have no open positions
      if(PositionsTotal() == 0)
        {
         if(!(OnnxRun(model,ONNX_DATA_TYPE_FLOAT,model_inputs,model_outputs)))
           {
            Comment("Failed to obtain a forecast from our model: ",GetLastError());
           }
         else
           {
            Comment("Forecast: ",model_outputs);
            //--- Trading rules
            if((ma_fast_reading[0] > ma_slow_reading[0]) && (low > ma_fast_reading[0]) && (model_outputs[0] > 0))
              {
               //--- Buy signal
               Trade.Buy(0.01,"EURUSD",ask,ask-(atr_reading[0] * 2),ask+(atr_reading[0] * 2),"");
              }
            else if((ma_fast_reading[0] < ma_slow_reading[0]) && (high < ma_slow_reading[0]) && (model_outputs[0] < 0))
              {
               //--- Sell signal
               Trade.Sell(0.01,"EURUSD",bid,bid+(atr_reading[0] * 2),bid-(atr_reading[0] * 2),"");
              }
           }
        }
     }
  }
//+------------------------------------------------------------------+
When we consider the equity curve produced by the updated version of the strategy, we can immediately see discernible differences. The volatility of the account balance is far better controlled, and there is a stronger, clearer uptrend, implying that our balance grew steadily over the three-year test period. Our new strategy therefore appears considerably more sound than the initial strategy.

Figure 8: Our trading strategy is now exhibiting a strong and healthy balance over time
When we take a look at the detailed results, we can see that the total net profit has more than doubled relative to the version just prior, now standing at $120.00. In addition, the gross loss accumulated over the test period has fallen considerably: the original version of the strategy accrued a gross loss of $900, while this new version accrued a loss of only $300. Our expected payoff and profit factor are now both healthier than the original values. What is most impressive is that the total number of trades has been cut almost in half, meaning we are making significantly more profit with substantially fewer trades.
However, it is quite alarming to observe that the distribution of trades being placed by the application does not reflect a firm grasp of how financial markets should be traded. Our application placed 14 short trades and 45 long trades over the three-year backtest, which is substantially biased and a possible sign that something is fundamentally wrong with the statistical model that we’re using to guide our strategy.

Figure 9: The statistical model we implemented into our strategy introduced its own set of problems
Digging Deeper For Improvements
After giving it much thought, it appeared that the problem we were experiencing could be alleviated by using more flexible statistical models. We had been using a linear regression model, which is good for establishing a baseline but makes strict assumptions about the structure of the relationship. Therefore, we will replace the linear model with a random forest regressor, which is capable of learning nonlinear effects in the data that our linear model could not.
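The difference in flexibility is easy to demonstrate on synthetic data. The sketch below is an illustration rather than part of the article's pipeline: both models are fit to a noisy sine wave, and the random forest follows the curvature while the straight line cannot.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

# Synthetic, nonlinear relationship: y = sin(x) + noise
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(500, 1)), axis=0)
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 500)

# Simple interleaved holdout split for illustration
X_train, X_test = X[::2], X[1::2]
y_train, y_test = y[::2], y[1::2]

linear_score = LinearRegression().fit(X_train, y_train).score(X_test, y_test)
forest_score = RandomForestRegressor(random_state=0).fit(X_train, y_train).score(X_test, y_test)

# The forest's R^2 is much closer to 1 because it can bend with the data
print(round(linear_score, 2), round(forest_score, 2))
```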
Only a few changes were necessary in the Python notebook, and we focus exclusively on the changes that had to be made. Our new model is therefore defined as a random forest regressor, and then we fit our model and export it to ONNX format.
from sklearn.ensemble import RandomForestRegressor

model = RandomForestRegressor()
model.fit(X,y)

onnx_proto = convert_sklearn(model,initial_types=initial_types,final_types=final_types,target_opset=12)
onnx.save(onnx_proto,'EURUSD Detailed RF.onnx')
Cultivating Room For Growth in MQL5
Afterward, we import the model into our MQL5 application, and we can make a key improvement by replacing our old trading rules with new rules that are completely driven by our new and more flexible statistical model. The random forest model that we have chosen is capable of exploiting more market relationships than the linear model that we started with, and therefore it should, in theory, be able to make all trading decisions for us without having to use the classical trading rules that we initially applied.
//--- Trading rules
if(((model_outputs[0] > 0) && (model_outputs[1] > 0) && (model_outputs[2] > 0)) ||
   ((ma_fast_reading[0] > ma_slow_reading[0]) && (low > ma_fast_reading[0])))
  {
   //--- Buy signal
   Trade.Buy(0.01,"EURUSD",ask,ask-(atr_reading[0] * 2),ask+(atr_reading[0] * 2),"");
  }
else if(((model_outputs[0] < 0) && (model_outputs[1] < 0) && (model_outputs[2] < 0)) ||
        ((ma_fast_reading[0] < ma_slow_reading[0]) && (high < ma_slow_reading[0])))
  {
   //--- Sell signal
   Trade.Sell(0.01,"EURUSD",bid,bid+(atr_reading[0] * 2),bid-(atr_reading[0] * 2),"");
  }
When we observe the new equity curve produced by the more powerful nonlinear model, we can see the account balance rising to new highs that we had not attained in any previous test. However, some volatile behavior is beginning to emerge: from around December 2022 until about March 2024, the strategy was in an intense drawdown. Although it eventually recovered, this is disappointing, because the version just before this one did not suffer such a volatile period. All in all, the balance still trends positively, which suggests the strategy remains sound.

Figure 10: Using a non-linear statistical model helped us attain new levels of performance from the same strategy
When we consider the detailed results, we can see the effects of the changes we have made. The total net profit is more or less the same as before. However, the gross loss has more than doubled, and the total number of trades placed has also increased. Although the total profit improved marginally, this new version of our application does substantially more work to obtain roughly the same results. The key improvement is that the distribution of trades now reflects the nature of the market, which the previous version's did not.

Figure 11: The new non-linear supervised model corrected the bias that was learned by the classical linear model
Final Attempts
We will now make a final attempt to lift our trading strategy to new levels of performance. To do so, I considered carefully fetching more data on the same market. This can be achieved by creating new features that were not originally included in the dataset. Therefore, we manually defined a collection of features that capture the growth happening across the primary market feeds. By calculating the differences between the two moving averages, between price and the moving averages, and between the four price feeds themselves, we defined roughly 20 inputs, whereas originally we had only 6.
//+------------------------------------------------------------------+
//|                                                      ProjectName |
//|                                      Copyright 2020, CompanyName |
//|                                       http://www.companyname.net |
//+------------------------------------------------------------------+
#property copyright "Copyright 2024, MetaQuotes Ltd."
#property link      "https://www.mql5.com"
#property version   "1.00"
#property script_show_inputs
//--- Define our moving average indicator
#define MA_PERIOD_FAST 30       //--- Moving Average Fast Period
#define MA_PERIOD_SLOW 60       //--- Moving Average Slow Period
#define MA_TYPE        MODE_SMA //--- Type of moving average we have
#define HORIZON        5        //--- Forecast horizon
//--- Our handlers for our indicators
int ma_fast_handle,ma_slow_handle;
//--- Data structures to store the readings from our indicators
double ma_fast_reading[],ma_slow_reading[];
//--- File name
string file_name = Symbol() + " Cross Over Data.csv";
//--- Amount of data requested
input int size = 3000;
//+------------------------------------------------------------------+
//| Our script execution                                             |
//+------------------------------------------------------------------+
void OnStart()
  {
   int fetch = size + (HORIZON * 2);
//--- Setup our technical indicators
   ma_fast_handle = iMA(_Symbol,PERIOD_CURRENT,MA_PERIOD_FAST,0,MA_TYPE,PRICE_CLOSE);
   ma_slow_handle = iMA(_Symbol,PERIOD_CURRENT,MA_PERIOD_SLOW,0,MA_TYPE,PRICE_CLOSE);
//--- Set the values as series
   CopyBuffer(ma_fast_handle,0,0,fetch,ma_fast_reading);
   ArraySetAsSeries(ma_fast_reading,true);
   CopyBuffer(ma_slow_handle,0,0,fetch,ma_slow_reading);
   ArraySetAsSeries(ma_slow_reading,true);
//--- Write to file
   int file_handle=FileOpen(file_name,FILE_WRITE|FILE_ANSI|FILE_CSV,",");
   for(int i=size;i>=1;i--)
     {
      if(i == size)
        {
         FileWrite(file_handle,
                   //--- Time
                   "Time",
                   //--- OHLC
                   "Open","High","Low","Close",
                   //--- Moving averages
                   "MA F","MA S",
                   //--- Growth in OHLC channels
                   "Delta O","Delta H","Delta L","Delta C",
                   //--- Growth in MA channels
                   "Delta MA F","Delta MA S",
                   //--- Growth across OHLC channels
                   "Delta O - H","Delta O - L","Delta O - C",
                   "Delta H - L","Delta H - C","Delta L - C",
                   //--- Growth between price and the moving averages
                   "Delta C - MA F","Delta C - MA S"
                  );
        }
      else
        {
         FileWrite(file_handle,
                   iTime(_Symbol,PERIOD_CURRENT,i),
                   //--- OHLC
                   iOpen(_Symbol,PERIOD_CURRENT,i),
                   iHigh(_Symbol,PERIOD_CURRENT,i),
                   iLow(_Symbol,PERIOD_CURRENT,i),
                   iClose(_Symbol,PERIOD_CURRENT,i),
                   //--- Moving averages
                   ma_fast_reading[i],
                   ma_slow_reading[i],
                   //--- Growth in OHLC channels
                   iOpen(_Symbol,PERIOD_CURRENT,i) - iOpen(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   iHigh(_Symbol,PERIOD_CURRENT,i) - iHigh(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   iLow(_Symbol,PERIOD_CURRENT,i) - iLow(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   iClose(_Symbol,PERIOD_CURRENT,i) - iClose(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   //--- Growth in MA channels
                   ma_fast_reading[i] - ma_fast_reading[i+HORIZON],
                   ma_slow_reading[i] - ma_slow_reading[i+HORIZON],
                   //--- Growth across OHLC channels
                   iOpen(_Symbol,PERIOD_CURRENT,i+HORIZON) - iHigh(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   iOpen(_Symbol,PERIOD_CURRENT,i+HORIZON) - iLow(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   iOpen(_Symbol,PERIOD_CURRENT,i+HORIZON) - iClose(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   iHigh(_Symbol,PERIOD_CURRENT,i+HORIZON) - iLow(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   iHigh(_Symbol,PERIOD_CURRENT,i+HORIZON) - iClose(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   iLow(_Symbol,PERIOD_CURRENT,i+HORIZON) - iClose(_Symbol,PERIOD_CURRENT,i+HORIZON),
                   //--- Growth between price and the moving averages
                   iClose(_Symbol,PERIOD_CURRENT,i+HORIZON) - ma_fast_reading[i+HORIZON],
                   iClose(_Symbol,PERIOD_CURRENT,i+HORIZON) - ma_slow_reading[i+HORIZON]
                  );
        }
     }
//--- Close the file
   FileClose(file_handle);
  }
//+------------------------------------------------------------------+
//| Undefine system constants                                        |
//+------------------------------------------------------------------+
#undef HORIZON
#undef MA_PERIOD_FAST
#undef MA_PERIOD_SLOW
#undef MA_TYPE
//+------------------------------------------------------------------+
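For readers who prefer to prototype features before committing them to MQL5, the same growth features can be sketched in pandas. The frame below is built from synthetic prices standing in for the exported CSV, and only a handful of the roughly twenty columns are shown; the column names mirror the headers written by the script.

```python
import numpy as np
import pandas as pd

HORIZON = 5

# Synthetic close prices standing in for the exported CSV data
rng = np.random.default_rng(0)
close = pd.Series(1.10 + rng.normal(0, 0.001, 50).cumsum())

data = pd.DataFrame({
    "Close": close,
    "MA F": close.rolling(3).mean(),   # fast moving average stand-in
    "MA S": close.rolling(6).mean(),   # slow moving average stand-in
})

# Growth within each feed over the last HORIZON bars
data["Delta C"] = data["Close"] - data["Close"].shift(HORIZON)
data["Delta MA F"] = data["MA F"] - data["MA F"].shift(HORIZON)
# Growth between price and the moving averages
data["Delta C - MA F"] = data["Close"] - data["MA F"]
data["Delta C - MA S"] = data["Close"] - data["MA S"]

print(list(data.columns))
```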
Analyzing The Data In Python
After doing that, we load the scikit-learn modules we need for modelling and cross-validation.
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import TimeSeriesSplit,cross_val_score
And then, we create our time-series cross-validation object.
tscv = TimeSeriesSplit(n_splits=5,gap=HORIZON)

We then create a new function called get_model, which returns a fresh instance of the same random forest model we used earlier.
def get_model():
    return(RandomForestRegressor())
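It is worth noting why the gap argument matters here: because each label looks HORIZON bars into the future, a plain chronological split would let training labels overlap the test window. The small demonstration below, which is independent of the market data, shows that TimeSeriesSplit excludes HORIZON samples between the end of every training fold and the start of the corresponding test fold.

```python
import numpy as np
from sklearn.model_selection import TimeSeriesSplit

HORIZON = 5
X_demo = np.arange(100).reshape(-1, 1)
tscv = TimeSeriesSplit(n_splits=5, gap=HORIZON)

for train_idx, test_idx in tscv.split(X_demo):
    # HORIZON samples are excluded between train and test, so labels
    # computed with shift(-HORIZON) cannot leak into the test fold.
    assert test_idx[0] - train_idx[-1] == HORIZON + 1
```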
Now we would like to carefully measure the improvements made by all the new features we created. Therefore, we define two input sets: one containing all the inputs we have, and the second containing only the classical inputs, that is, the open, high, low, close, and the two moving averages. Then we define the targets.
X = train.iloc[:,1:-3]
X_classic = train.iloc[:,1:7]
y = train.iloc[:,-3:]
Now we want to measure the improvement made at predicting each target. Therefore, we create an array for each target to store our performance, and to each array we append both the original performance obtained using the classical data and the new performance attained using all the data we fetched.
target_1 = []
target_2 = []
target_3 = []
As we can see, when predicting the future EURUSD returns, the detailed market data that we fetched improved our performance levels significantly. Our error levels appear to drop by more than half.
target_1.append(np.mean(np.abs(cross_val_score(get_model(),X_classic,y.iloc[:,0],cv=tscv,scoring='neg_mean_squared_error'))))
target_1.append(np.mean(np.abs(cross_val_score(get_model(),X,y.iloc[:,0],cv=tscv,scoring='neg_mean_squared_error'))))
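Since this scoring call is repeated for every target, it can be wrapped in a small helper. The sketch below is a hypothetical refactoring, not part of the original notebook: the name `score_model`, its `horizon` parameter (standing in for the HORIZON constant), and the synthetic demonstration data are all our own additions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

def score_model(X, y, horizon=5, splits=5):
    """Mean absolute cross-validated MSE for a fresh random forest.

    Hypothetical helper wrapping the repeated scoring call; `horizon`
    stands in for the article's HORIZON constant.
    """
    tscv = TimeSeriesSplit(n_splits=splits, gap=horizon)
    scores = cross_val_score(RandomForestRegressor(random_state=0),
                             X, y, cv=tscv,
                             scoring='neg_mean_squared_error')
    return float(np.mean(np.abs(scores)))

# Synthetic demonstration data (not the article's EURUSD dataset)
rng = np.random.default_rng(0)
X_demo = rng.normal(size=(120, 6))
y_demo = 0.5 * X_demo[:, 0] + rng.normal(scale=0.1, size=120)
print(score_model(X_demo, y_demo))
```

With a helper like this, each comparison collapses to two calls, for example `target_1.append(score_model(X_classic, y.iloc[:,0]))` followed by `target_1.append(score_model(X, y.iloc[:,0]))`.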

Figure 12: Our detailed market data helped us better forecast future EURUSD returns out of sample
The same is also true for our fast moving average, the 30-period moving average. The new market data that we curated by hand brought our error levels down to remarkably low values, a significant improvement in our cross-validated error.
target_2.append(np.mean(np.abs(cross_val_score(get_model(),X_classic,y.iloc[:,1],cv=tscv,scoring='neg_mean_squared_error'))))
target_2.append(np.mean(np.abs(cross_val_score(get_model(),X,y.iloc[:,1],cv=tscv,scoring='neg_mean_squared_error'))))

Figure 13: We also observed reduced error rates when we employed more detailed market data to forecast the future value of the 30-Period moving average
Unfortunately, the slow moving average, the 60-period moving average, did not benefit as much from the detailed data that we generated by hand. It appears that only the first two targets benefited meaningfully from our efforts.
target_3.append(np.mean(np.abs(cross_val_score(get_model(),X_classic,y.iloc[:,2],cv=tscv,scoring='neg_mean_squared_error'))))
target_3.append(np.mean(np.abs(cross_val_score(get_model(),X,y.iloc[:,2],cv=tscv,scoring='neg_mean_squared_error'))))

Figure 14: The 60-period moving average remained difficult to forecast and benefited only marginally from the detailed market data we created
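Comparisons like those in Figures 12 through 14 can be redrawn from the three score arrays with a simple bar chart. The sketch below assumes matplotlib is available and uses illustrative placeholder error values rather than the notebook's actual results; each list holds the classical score followed by the all-features score.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Illustrative placeholder values only; substitute the real target_1/2/3 lists
target_1 = [0.0040, 0.0015]  # EURUSD returns: [classical, all features]
target_2 = [0.0030, 0.0008]  # 30-period SMA
target_3 = [0.0029, 0.0027]  # 60-period SMA

labels = ["Classical inputs", "All inputs"]
titles = ["EURUSD returns", "30-period SMA", "60-period SMA"]
fig, axes = plt.subplots(1, 3, figsize=(9, 3))
for ax, scores, title in zip(axes, [target_1, target_2, target_3], titles):
    ax.bar(labels, scores)           # one bar per feature set
    ax.set_title(title)
    ax.set_ylabel("Cross-validated MSE")
fig.tight_layout()
fig.savefig("error_comparison.png")  # writes the comparison chart to disk
```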
Implementing Our Improvements in MQL5
From there, we now have to change a few things in our application. For starters, the input shape of our application has to be changed. It was initially 6, and now we have to set it to 20.
//+------------------------------------------------------------------+
//| Expert initialization function                                   |
//+------------------------------------------------------------------+
int OnInit()
  {
//--- Setup our indicators
   ma_fast_handler = iMA("EURUSD",PERIOD_D1,30,0,MODE_SMA,PRICE_CLOSE);
   ma_slow_handler = iMA("EURUSD",PERIOD_D1,60,0,MODE_SMA,PRICE_CLOSE);
   atr_handler     = iATR("EURUSD",PERIOD_D1,14);

//--- Setup the ONNX model
   model = OnnxCreateFromBuffer(onnx_proto,ONNX_DEFAULT);

//--- Define the model parameter shape
   ulong input_shape[]  = {1,20};
   ulong output_shape[] = {1,3};
   OnnxSetInputShape(model,0,input_shape);
   OnnxSetOutputShape(model,0,output_shape);
   model_inputs  = vectorf::Zeros(20);
   model_outputs = vectorf::Zeros(3);

   if(model != INVALID_HANDLE)
     {
      return(INIT_SUCCEEDED);
     }
//---
   return(INIT_FAILED);
  }
Additionally, when updated price levels are received, we now have a lot more features to feed into our model’s inputs, but the way we handle the model’s outputs would be more or less the same.
//--- Update the time
   if(current_time != time_stamp)
     {
      time_stamp = current_time;

      //--- Fetch current indicator readings
      CopyBuffer(ma_fast_handler,0,0,10,ma_fast_reading);
      CopyBuffer(ma_slow_handler,0,0,10,ma_slow_reading);
      CopyBuffer(atr_handler,0,0,10,atr_reading);

      double open  = iOpen("EURUSD",PERIOD_D1,0);
      double close = iClose("EURUSD",PERIOD_D1,0);
      double high  = iHigh("EURUSD",PERIOD_D1,0);
      double low   = iLow("EURUSD",PERIOD_D1,0);

      model_inputs[0]  = (float) open;
      model_inputs[1]  = (float) high;
      model_inputs[2]  = (float) low;
      model_inputs[3]  = (float) close;
      model_inputs[4]  = (float) ma_fast_reading[0];
      model_inputs[5]  = (float) ma_slow_reading[0];
      model_inputs[6]  = (float)(iOpen(_Symbol,PERIOD_CURRENT,0)  - iOpen(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[7]  = (float)(iHigh(_Symbol,PERIOD_CURRENT,0)  - iHigh(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[8]  = (float)(iLow(_Symbol,PERIOD_CURRENT,0)   - iLow(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[9]  = (float)(iClose(_Symbol,PERIOD_CURRENT,0) - iClose(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[10] = (float)(ma_fast_reading[0] - ma_fast_reading[0+HORIZON]);
      model_inputs[11] = (float)(ma_slow_reading[0] - ma_slow_reading[0+HORIZON]);
      model_inputs[12] = (float)(iOpen(_Symbol,PERIOD_CURRENT,0+HORIZON) - iHigh(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[13] = (float)(iOpen(_Symbol,PERIOD_CURRENT,0+HORIZON) - iLow(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[14] = (float)(iOpen(_Symbol,PERIOD_CURRENT,0+HORIZON) - iClose(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[15] = (float)(iHigh(_Symbol,PERIOD_CURRENT,0+HORIZON) - iLow(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[16] = (float)(iHigh(_Symbol,PERIOD_CURRENT,0+HORIZON) - iClose(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[17] = (float)(iLow(_Symbol,PERIOD_CURRENT,0+HORIZON)  - iClose(_Symbol,PERIOD_CURRENT,0+HORIZON));
      model_inputs[18] = (float)(iClose(_Symbol,PERIOD_CURRENT,0+HORIZON) - ma_fast_reading[0+HORIZON]);
      model_inputs[19] = (float)(iClose(_Symbol,PERIOD_CURRENT,0+HORIZON) - ma_slow_reading[0+HORIZON]);

      ask = SymbolInfoDouble("EURUSD",SYMBOL_ASK);
      bid = SymbolInfoDouble("EURUSD",SYMBOL_BID);

      //--- If we have no open positions
      if(PositionsTotal() == 0)
        {
         if(!(OnnxRun(model,ONNX_DATA_TYPE_FLOAT,model_inputs,model_outputs)))
           {
            Comment("Failed to obtain a forecast from our model: ",GetLastError());
           }
         else
           {
            Comment("Forecast: ",model_outputs);

            //--- Trading rules
            if(((model_outputs[0] > 0) && (model_outputs[1] > 0) && (model_outputs[2] > 0)) ||
               ((ma_fast_reading[0] > ma_slow_reading[0]) && (low > ma_fast_reading[0])))
              {
               //--- Buy signal
               Trade.Buy(0.01,"EURUSD",ask,ask-(atr_reading[0] * 2),ask+(atr_reading[0] * 2),"");
              }
            else if(((model_outputs[0] < 0) && (model_outputs[1] < 0) && (model_outputs[2] < 0)) ||
                    ((ma_fast_reading[0] < ma_slow_reading[0]) && (low < ma_slow_reading[0])))
              {
               //--- Sell signal
               Trade.Sell(0.01,"EURUSD",bid,bid+(atr_reading[0] * 2),bid-(atr_reading[0] * 2),"");
              }
           }
        }
     }
  }
Finally, when we consider the equity curve brought about by our new “big-data” approach to analyzing the market, we can unfortunately see that a lot of noise has been introduced into our system, and the system is no longer profitable. It is volatile, and it has lost its positive uptrend.

Figure 15: The new equity curve we have produced is far too volatile
Additionally, the detailed statistical analysis of our performance shows that it has deteriorated. We are once again suffering from trade entries biased towards long positions, the expected payoff is back to being negative, and our total net profit is negative as well. Therefore, we can clearly see that the previous version of our application, version four, was the best version we have made so far in this discussion.

Figure 16: A detailed analysis of the results brought about by the final version of our application
Conclusion
In conclusion, this article has demonstrated how strategies that are normally considered outdated and too widely exploited to be profitable can still be improved. Contrary to popular opinion, these strategies can be carefully reimagined to reach new levels of performance. By identifying the weaknesses of a strategy, we gain useful leads on the improvements worth making. We hope this article has shown the reader how to reduce the amount of noise penetrating a strategy, and has pointed to areas of the reader's own classical strategies that can still be put to work today with a little renewed effort.
Lastly, this article has also exposed one of the pitfalls of classical supervised machine learning. As we discussed in our related series of articles, Overcoming The Limitations of AI, the error metrics we use to measure the performance of statistical models do not necessarily align with the performance metrics we care about as algorithmic traders. Readers who need a refresher on that previous discussion can find a link attached here.
For returning readers, the dangers of blindly trusting RMSE should now be clear. In our analysis of the detailed market data, we observed significant improvements in out-of-sample error when forecasting EURUSD returns and the 30-period SMA. But these material improvements brought unwanted effects on our profitability. The reader should therefore walk away better informed of the limitations of classical supervised statistical learning.
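To make this pitfall concrete, consider the following toy example (illustrative numbers only, not the article's data). One forecast achieves the lower MSE yet calls the direction of every move incorrectly, while the other has a higher MSE but perfect directional accuracy, which is what actually matters when the forecast drives trade entries.

```python
import numpy as np

# Hypothetical future returns (illustrative placeholder values)
actual = np.array([0.010, -0.020, 0.015, -0.010, 0.020])

# Forecast A: always the correct sign, but exaggerated magnitude
forecast_a = 2.5 * actual
# Forecast B: smaller squared errors, but always the wrong sign
forecast_b = -0.2 * actual

def mse(y, yhat):
    # Mean squared error of the forecast
    return float(np.mean((y - yhat) ** 2))

def directional_accuracy(y, yhat):
    # Fraction of observations where the forecast sign matches the actual sign
    return float(np.mean(np.sign(y) == np.sign(yhat)))

print(mse(actual, forecast_a), directional_accuracy(actual, forecast_a))  # higher MSE, 100% direction
print(mse(actual, forecast_b), directional_accuracy(actual, forecast_b))  # lower MSE, 0% direction
```

Here forecast B's error is 1.2 times each actual return while forecast A's error is 1.5 times each actual return, so B wins on MSE despite being on the wrong side of every single move.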
| File Name | File Description |
|---|---|
| Fetch Data.mq5 | The script we wrote to fetch basic market data on the EURUSD exchange rates (4 OHLC price feeds, 2 moving averages). |
| Fetch Data 2.mq5 | The script we wrote to fetch detailed market data on the EURUSD exchange rates (20 columns). |
| MA_Crossover_V1.mq5 | The most widely recognized version of the moving average crossover strategy we implemented to establish baseline performance levels. |
| MA_Crossover_V2.mq5 | The best manual improvements we could imagine that improved the strategy's long term performance. |
| MA_Crossover_V3.mq5 | This version of our strategy was guided by a simple statistical model, but learned a bias for long entries. |
| MA_Crossover_V4.mq5 | The best version of our trading strategy that we built together, it corrected the bias of the previous version without losing profitability. |
| MA_Crossover_V5.mq5 | The final version of our trading strategy that we built using large and detailed observations of EURUSD exchange rates. |
| Advanced_Moving_Averages.ipynb | The Jupyter notebook we used to analyze our market data. |
Warning: All rights to these materials are reserved by MetaQuotes Ltd. Copying or reprinting of these materials in whole or in part is prohibited.
This article was written by a user of the site and reflects their personal views. MetaQuotes Ltd is not responsible for the accuracy of the information presented, nor for any consequences resulting from the use of the solutions, strategies or recommendations described.