MetaTester giving different results on each run for single tests and optimizations with the same parameters
We identified an issue with MetaTester: when we run a Single Test once and then run the same Single Test again without changing anything, it gives different results.
The same problem also occurs with Optimizations.
We are sending a ZIP file with all the relevant data.
This problem is new with one of the latest builds (1010+); we didn't have it in past builds. We do a lot of heavy optimization, and this is a very big issue: if you can't trust the results you get on single tests and optimizations (not being able to reproduce them run after run), there's really little point in backtesting at all. Has anyone noticed this? Does anyone have a possible solution?
We already contacted ServiceDesk with no luck.
Thanks

Could be down to inconsistent price data, if you are using the cloud...
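One way to rule the data in or out: fingerprint the tick stream each run actually receives. Below is a minimal diagnostic sketch (our own idea, not an official tool; the name ReproCheck and the hashing scheme are made up for illustration). Attach it to the tested chart, or merge the hashing into your EA: if two identical runs print different hashes, the tester fed you different data; if the hashes match but the results still differ, the cause is elsewhere (EA logic, build, agents).

// ReproCheck.mq5 -- hypothetical helper: fingerprints the tick stream of a test run.
// Compare the printed hash across two identical runs; a mismatch means the
// input data itself differed between the runs.
ulong g_hash  = 0;
int   g_ticks = 0;

void HashValue(const ulong v)
  {
   g_hash = g_hash * 1315423911 + v;     // simple multiplicative mixing
  }

void OnTick()
  {
   MqlTick t;
   if(!SymbolInfoTick(_Symbol, t))
      return;
   HashValue((ulong)t.time);             // tick time (seconds)
   HashValue((ulong)(t.bid * 1000000));  // bid scaled to an integer
   HashValue((ulong)(t.ask * 1000000));  // ask scaled to an integer
   g_ticks++;
  }

double OnTester()
  {
   PrintFormat("ReproCheck: ticks=%d, stream hash=%I64u", g_ticks, g_hash);
   return(0.0);
  }

Running this once on local agents only and once with cloud agents enabled would also show whether only the cloud runs drift, which would point at the price data the remote agents downloaded.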
