Optimising gives different results each time I do it

 

So the same data, the same times, and the same EA. I optimise it, record the data, test again with different dates, then go back to the original dates and test again, and the results from the first and third tests are different. They should be the same. The data is the same and the test is the same. All the settings are the same, so why would the results be different?

It makes my testing seem kind of pointless

Help please :)

Thanks

 

Hello "cciguy",

In the beginning I had the same problem until I realized that it was the spread that was changing the outcome.

If you have a broker with variable spreads, then each time you run the strategy tester with the option "Spread: Current", it uses the spread at that moment. When you run it on another occasion and the internal cache file is regenerated, the current spread may differ from the last time, and the results will be different.

What I do is set the Spread to a fixed amount based on the broker's average spread. For example, if you are testing on EURUSD and the broker's average spread is 2.0 pips, then set it to 20 (for a 5-digit broker) or 2 (for a 4-digit broker). You don't have to accept just the values in the pull-down list; you can also type in a specific value (for example, 13 for 1.3 pips on a 5-digit broker).
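The pips-to-points arithmetic can be written as a small MQL4 helper. This is just an illustration (the function name is mine, and it assumes standard 4/5-digit quotes, or 2/3 digits for JPY pairs):

  // Hypothetical helper: convert a spread quoted in pips into the
  // point value that the tester's "Spread" field expects.
  int SpreadPipsToPoints(double pips)
    {
     // On 5-digit (or 3-digit JPY) quotes, 1 pip = 10 points;
     // on 4-digit (or 2-digit JPY) quotes, 1 pip = 1 point.
     int pointsPerPip = (Digits == 5 || Digits == 3) ? 10 : 1;
     return((int)MathRound(pips * pointsPerPip));
    }

With that, SpreadPipsToPoints(2.0) gives 20 on a 5-digit broker and 2 on a 4-digit one, matching the values above.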

If you don't know what the average spread for your broker is, you can use a fixed assumed value, or use my "Spread Tracker" to find out. Just run it over a reasonable time period to get an idea of the maximum, minimum and average values of the spread for a particular symbol. I usually run it on an M1 chart and leave it running for an entire week. That way I can also see the high values the spread reaches during the opening and closing of the week. You can also run it during critical or important news events to see how the spread is affected during those periods.
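If you would rather roll your own, here is a minimal MQL4 sketch of the same idea (an illustration only, not the actual "Spread Tracker"):

  // Minimal spread-tracking sketch: attach it to a chart and it records
  // the minimum, maximum and average spread seen on incoming ticks.
  double minSpread = 0, maxSpread = 0, sumSpread = 0;
  int    ticks = 0;

  int start()
    {
     double spreadPoints = (Ask - Bid) / Point;  // current spread in points
     if(ticks == 0 || spreadPoints < minSpread) minSpread = spreadPoints;
     if(spreadPoints > maxSpread)               maxSpread = spreadPoints;
     sumSpread += spreadPoints;
     ticks++;
     Comment("Spread (points)  min: ", DoubleToStr(minSpread, 1),
             "  max: ", DoubleToStr(maxSpread, 1),
             "  avg: ", DoubleToStr(sumSpread / ticks, 1));
     return(0);
    }

Left on an M1 chart for a week, as described above, the Comment line gives you the numbers to plug into the tester.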

Regards,
FMIC

 

Thanks FMIC

I tried that already. And I tried testing while offline and it was the same. Still inconsistent.

 
cciguy: I tried that already. And I tried testing while offline and it was the same. Still inconsistent.
Do you also get inconsistent results when optimizing a default EA like Macd_Sample?
 
cciguy:

So the same data, the same times, and the same EA. I optimise it, record the data, test again with different dates, then go back to the original dates and test again, and the results from the first and third tests are different. They should be the same. The data is the same and the test is the same. All the settings are the same, so why would the results be different?

It makes my testing seem kind of pointless

Read this thread, especially the last two pages: https://www.mql5.com/en/forum/148577
 
ubzen:
Do you also get inconsistent results when optimizing a default EA like Macd_Sample?


yes I do
 
RaptorUK:
Read this thread, especially the last two pages: https://www.mql5.com/en/forum/148577


I disconnected from the internet and it was the same: different results each time. And I tried the broker's default EAs as well. I guess this is not the same for everyone then?

 
cciguy:


I disconnected from the internet and it was the same: different results each time. And I tried the broker's default EAs as well. I guess this is not the same for everyone then?

You are doing something wrong . . .
 
  1. Try deleting all FXT cache files in the "tester\history" folder and starting over.
  2. Also, check your Windows Logs, and make sure you are not having any disk drive problems, such as bad clusters, that may be corrupting data files.
  3. Check that there is no malware interfering with normal functionality.
  4. If all else fails, try the old "cliché" of rebooting your machine.
 

I tried those and it still produces different results each time I test.

 
cciguy:

I tried those and it still produces different results each time I test.


Are you specifying a specific date range, or are you leaving the date range disabled or including today's date in it?

If it's the latter, then that is the expected behaviour: new data is always being added, so each time you run the test the results will be different.

If this is the case, then set a specific date range that does not include today's date. That way any new data will not affect the test.
