Examples: Using Neural Networks In MetaTrader - page 4

 
arnab:

I tried to test the attached EAs. I tested the EA for the period 1.1.2007 to 1.1.2009. First I ran the EA with SaveAnn=true and NeuroFilter=false and then I ran the EA with SaveAnn=false and NeuroFilter=true. I got the same result. Am I doing anything wrong?


Thanks

Arnab

You should also set AnnsNumber to 30 when you turn SaveAnn on.

But I'm not sure that is the cause of your problem.
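
For reference, the inputs involved are declared as extern parameters in the EA; a sketch of how they should look for the training run (switch them as noted in the comments for the filtering run):

extern bool SaveAnn     = true;    // true for the training run, false for the filtering run
extern bool NeuroFilter = false;   // false for the training run, true for the filtering run
extern int  AnnsNumber  = 30;      // keep at 30 in both runs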

 
traderkmi:

Hi.


How can I modify this EA so that it works with MACD together with another indicator, for example MACD + MA?


For example, modify it to work not only with MACD but also with an MA, something like this:


When the MA value of the last bar is greater than the MA value of the bar before it (shifted back one bar), it is a BUY signal,

for example BAR 0: MA(40) 1.4000; BAR -1: MA(40) 1.3980 --> Buy signal


when the MA value of the last bar is less than the MA value of the bar before it (shifted back one bar), it is a SELL signal,

for example BAR 0: MA(40) 1.3727; BAR -1: MA(40) 1.3735 --> Sell signal


The MA in this example is 40-period, exponential, applied to the close price.


What should I add/modify in the original code of this EA in order to require both the MACD and MA signals before placing an order?


For example, opening a BUY order only when the MACD gives a buy signal AND the MA also gives a buy signal.


I hope this is clear.


Thanks in advance for any insights.
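
For illustration, here is a minimal sketch of the MA-slope rule described above (the variable names are just examples), using shifts 1 and 2 so the still-forming bar 0 is ignored:

double MaLast = iMA (NULL, 0, 40, 0, MODE_EMA, PRICE_CLOSE, 1);   // last closed bar
double MaPrev = iMA (NULL, 0, 40, 0, MODE_EMA, PRICE_CLOSE, 2);   // bar before it
bool MaBuySignal  = (MaLast > MaPrev);    // EMA(40) rising  --> buy side allowed
bool MaSellSignal = (MaLast < MaPrev);    // EMA(40) falling --> sell side allowed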


Hi.


I've tried to make those modifications, but I don't know if they are logically correct. Can you give me some insight?


void
ann_prepare_input ()
{
    int i;
    for (i = 0; i <= AnnInputs - 1; i = i + 4) {        // was i + 3
        InputVector[i] =
            10 * iMACD (NULL, 0, FastMA, SlowMA, SignalMA, PRICE_CLOSE,
                        MODE_MAIN, i * 4);              // was i * 3
        InputVector[i + 1] =
            10 * iMACD (NULL, 0, FastMA, SlowMA, SignalMA, PRICE_CLOSE,
                        MODE_SIGNAL, i * 4);            // was i * 3
        /* Difference between the MACD main and signal values for this bar */
        InputVector[i + 2] = InputVector[i] - InputVector[i + 1];
        /* New input for the MA purpose */
        InputVector[i + 3] = iMA (NULL, 0, 40, 0, MODE_EMA, PRICE_CLOSE, i * 4);
    }
}



And this is the other part:

    /* Calculate last and previous MACD values.
     * Lag one bar as the current bar is still building up.
     */
    double MacdLast = iMACD (NULL, 0, FastMA, SlowMA, SignalMA, PRICE_CLOSE,
                             MODE_MAIN, 1);
    double MacdPrev = iMACD (NULL, 0, FastMA, SlowMA, SignalMA, PRICE_CLOSE,
                             MODE_MAIN, 2);
    double SignalLast = iMACD (NULL, 0, FastMA, SlowMA, SignalMA, PRICE_CLOSE,
                               MODE_SIGNAL, 1);
    double SignalPrev = iMACD (NULL, 0, FastMA, SlowMA, SignalMA, PRICE_CLOSE,
                               MODE_SIGNAL, 2);

    /* Added code start: last and previous EMA(40) values */
    double MaLast = iMA (NULL, 0, 40, 0, MODE_EMA, PRICE_CLOSE, 1);
    double MaPrev = iMA (NULL, 0, 40, 0, MODE_EMA, PRICE_CLOSE, 2);
    /* Added code end */

    /* BUY signal */
    if (MacdLast > SignalLast && MacdPrev < SignalPrev
        && MaLast > MaPrev) {           // Added: EMA(40) rising
        BuySignal = true;
    }
    /* SELL signal */
    if (MacdLast < SignalLast && MacdPrev > SignalPrev
        && MaLast < MaPrev) {           // Added: EMA(40) falling
        SellSignal = true;
    }


This one too (I changed 3 to 4):

init ()
{
    int i, ann;
    if (!is_ok_period (PERIOD_M5)) {
	debug (0, "Wrong period!");
	return (-1);
    }
    AnnInputs = (AnnInputs / 4) * 4;	// Make it an integer divisible by 4
    if (AnnInputs < 4) {
	debug (0, "AnnInputs too low!");
    }


And my question is:


Is this correct?


And how (if it needs to be modified at all) should I change this part:

// Compute MagicNumber and AnnPath
    MagicNumber += (SlowMA + 256 * FastMA + 65536 * SignalMA);
    AnnPath = StringConcatenate (ANN_PATH, NAME, "-", MagicNumber);

so that the MA is taken into account here, if it even needs to be added here?
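
(For example, I was thinking of something like the following, but this is only my guess; MaPeriod would be a new extern input that I add myself for the MA period:)

    // Compute MagicNumber and AnnPath, folding in the hypothetical MaPeriod so that
    // networks trained with different MA settings get distinct file names
    MagicNumber += (SlowMA + 256 * FastMA + 65536 * SignalMA + 16777216 * MaPeriod);
    AnnPath = StringConcatenate (ANN_PATH, NAME, "-", MagicNumber);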


And in the bodies of the ann_wise_long () and ann_wise_short () functions, should the loop be changed now? For example, from

for (i = 0; i < AnnsNumber; i += 2) {

to

for (i = 0; i < AnnsNumber; i += 3) {


and


ret = 2 * ret / AnnsNumber;

to

ret = 3 * ret / AnnsNumber;
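
For reference, the loop I am asking about sits in a function that looks roughly like this (a sketch reconstructed from the fragments above, so the details may differ from the actual EA):

double
ann_wise_long ()
{
    double ret = 0;
    int i;
    ann_prepare_input ();
    /* Presumably the even-indexed networks are the "long" ones and the
       odd-indexed ones are used by ann_wise_short(), hence the step of 2 */
    for (i = 0; i < AnnsNumber; i += 2) {
        ret += ann_run (AnnsArray[i], InputVector);
    }
    /* Average over the half of the networks that was actually run */
    ret = 2 * ret / AnnsNumber;
    return (ret);
}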


Thanks in advance for any replies to my questions.

 

Hi all

I'm getting a critical error when trying to optimize with NeuroFilter=false, SaveAnn=false, AnnsNumber=0.

My terminal build is 224.

Any help?

Many thanks

 

Please disregard my comment about the crash.

My .set file had an error.

I had ANNINPUT=0; the correct setting is ANNNUMBER=0.

 

This is great work! I was able to follow your directions, very clear. However, I have a problem. When I run the tester with NeuroFilter on for training data period 2007.12.31 to 2009.01.01, it works just fine. But when I try to run it for the test period 2009.01.01-2009.03.22 the results are the same both with NeuroFilter on and NeuroFilter off.

Here is the process I am running:

1) Run the EA for the training AND test periods to gather and record pre-ANN results for reference, with AnnsNumber=0, SaveAnn=FALSE, and NeuroFilter=FALSE.

2) Run the EA through the tester for the training period with AnnsNumber=30, SaveAnn=TRUE, and NeuroFilter=FALSE. This populates the C:\ANN folder with the ann files.

3) Set NeuroFilter=TRUE, SaveAnn=FALSE, and AnnsNumber=2 (your example parameters, not sure why 2?) and run for the TRAINING period. Good improvement in results, it is working!

4) With NeuroFilter=TRUE, SaveAnn=FALSE, and AnnsNumber=2, run for the TEST period. NO CHANGE IN RESULTS.....

Would you please help me?

 
fxorbust:

This is great work! I was able to follow your directions, very clear. However, I have a problem. When I run the tester with NeuroFilter on for training data period 2007.12.31 to 2009.01.01, it works just fine. But when I try to run it for the test period 2009.01.01-2009.03.22 the results are the same both with NeuroFilter on and NeuroFilter off.

Here is the process I am running:

1) Run the EA for the training AND test periods to gather and record pre-ANN results for reference, with AnnsNumber=0, SaveAnn=FALSE, and NeuroFilter=FALSE.

2) Run the EA through the tester for the training period with AnnsNumber=30, SaveAnn=TRUE, and NeuroFilter=FALSE. This populates the C:\ANN folder with the ann files.

3) Set NeuroFilter=TRUE, SaveAnn=FALSE, and AnnsNumber=2 (your example parameters, not sure why 2?) and run for the TRAINING period. Good improvement in results, it is working!

4) With NeuroFilter=TRUE, SaveAnn=FALSE, and AnnsNumber=2, run for the TEST period. NO CHANGE IN RESULTS.....

Would you please help me?


Well, I should mention that I had this problem running the FIXED version of the EA. When I run the original version, I do get different and better results for the test period. What is it about the fixed version that is not working for the test period?
 
fxorbust:

Well, I should mention that I had this problem running the FIXED version of the EA. When I run the original version, I do get different and better results for the test period. What is it about the fixed version that is not working for the test period?
Both versions are working, yet the skewed version performs better due to the trend. However, this article is not about how to make the most profitable EA, but about how to use a NN (namely FANN) in MQL4. This is just one example of applying a NN; there are plenty of other applications (price forecasting, pattern recognition, reinforcement learning, and so on). Try to find your best approach :)
 

OK BIG $20 million dollar question here: How do we set this up to build an NN from an OPTIMIZATION run? For example, right now we are building an NN and training from 1 single run. However, I just ran an optimization run of 10,000 passes and I would love to be able to train and build the NN from all the passes! I should let you know that my filter signal is different than my EA signals so changing the parameters for the optimization run will not affect things. In other words I am using stochastics for the EA and MACD to build the NN.

The end result will be that the MACD will filter the Stochastic signals based on the NN built from 10,000 passes. As it stands now, this tool will only make the NN files from a single run. How do we get it to continually build itself and train over the entire optimization run?

 
fxorbust:

OK BIG $20 million dollar question here: How do we set this up to build an NN from an OPTIMIZATION run? For example, right now we are building an NN and training from 1 single run. However, I just ran an optimization run of 10,000 passes and I would love to be able to train and build the NN from all the passes! I should let you know that my filter signal is different than my EA signals so changing the parameters for the optimization run will not affect things. In other words I am using stochastics for the EA and MACD to build the NN.

The end result will be that the MACD will filter the Stochastic signals based on the NN built from 10,000 passes. As it stands now, this tool will only make the NN files from a single run. How do we get it to continually build itself and train over the entire optimization run?

You're wrong. If you set SaveAnn to true, this EA will save the network at the end of the run. The ann_load() function first looks for a network with the given name; if it is not found, a new one is created. So the next run will load the network saved in the previous run, and so on, preserving all the information stored in the network across runs.
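
For reference, that load-or-create logic looks roughly like this (a simplified sketch of the EA's ann_load(); the exact network topology and error handling in the real code may differ):

int ann_load (string path)
{
    int ann = -1;

    /* Try to load an existing network file first */
    ann = f2M_create_from_file (path);
    if (ann != -1) {
        debug (1, "ANN: '" + path + "' loaded successfully");
        return (ann);
    }

    /* Not found: create a fresh network and randomize its weights */
    ann = f2M_create_standard (4, AnnInputs, AnnInputs, AnnInputs / 2, 1);
    f2M_set_act_function_hidden (ann, FANN_SIGMOID_SYMMETRIC_STEPWISE);
    f2M_set_act_function_output (ann, FANN_SIGMOID_SYMMETRIC_STEPWISE);
    f2M_randomize_weights (ann, -0.4, 0.4);
    debug (1, "ANN: '" + path + "' created with random weights");

    return (ann);
}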
 
emsi:
fxorbust:

OK BIG $20 million dollar question here: How do we set this up to build an NN from an OPTIMIZATION run? For example, right now we are building an NN and training from 1 single run. However, I just ran an optimization run of 10,000 passes and I would love to be able to train and build the NN from all the passes! I should let you know that my filter signal is different than my EA signals so changing the parameters for the optimization run will not affect things. In other words I am using stochastics for the EA and MACD to build the NN.

The end result will be that the MACD will filter the Stochastic signals based on the NN built from 10,000 passes. As it stands now, this tool will only make the NN files from a single run. How do we get it to continually build itself and train over the entire optimization run?

You're wrong. If you set SaveAnn to true, this EA will save the network at the end of the run. The ann_load() function first looks for a network with the given name; if it is not found, a new one is created. So the next run will load the network saved in the previous run, and so on, preserving all the information stored in the network across runs.

OK thanks for the insight. I noticed the file size does not change though, even if I train the NN on an optimization run of 10,000 passes. Shouldn't the file grow as the NN is built from many passes?