
All you said makes complete sense to me. And as for "higher memory functions", I do remember seeing memory mentioned many times in more advanced coding threads.
Yeah, in MQL5 we need to use simple but effective memory management... that's why the language is much faster than PineScript. It's almost like a poor man's C++ :D
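For context, the "simple but effective memory management" meant here usually comes down to pre-sizing arrays with a reserve and freeing them explicitly, instead of letting buffers regrow on every tick. A minimal sketch (the buffer name and reserve size are illustrative, not from this thread):

```mql5
// Pre-allocate a buffer once with a reserve so ArrayResize()
// does not reallocate on every call; free it explicitly when done.
double ticks[];

int OnInit()
  {
   // Reserve room for ~100k elements up front (illustrative size)
   ArrayResize(ticks, 0, 100000);
   return(INIT_SUCCEEDED);
  }

void OnTick()
  {
   int n = ArraySize(ticks);
   ArrayResize(ticks, n + 1, 100000);   // reserve keeps this cheap
   ticks[n] = SymbolInfoDouble(_Symbol, SYMBOL_BID);
  }

void OnDeinit(const int reason)
  {
   ArrayFree(ticks);                    // release the memory explicitly
  }
```

The third `ArrayResize` parameter is the reserve: the array grows in large chunks instead of reallocating element by element, which is exactly the kind of cheap trick that keeps tester runs fast.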
It is definitely under control, but 1 GB per instrument? That's nothing! Check out the images I posted in the 2nd comment: 5 GB per thread. But I was also using "every tick based on real ticks", and I was testing EURGBP on an AUD account. Once I changed the account currency from AUD to EUR, it dropped to 500 - 650 MB per thread. The memory never increased, but it did decrease by at least 100 - 250 MB until the start of the next testing iteration (as it is called in the journal); then memory went back up to the 650 MB and gradually fell again until the next iteration started.
I would almost be willing to bet that even the devs don't know.
And each EA tested in the optimiser uses a different amount of memory, which seems to vary with how many inputs you are optimising - albeit it seems to be only a 50 - 100 MB (per thread) difference between testing 2, 3, 4 or 5 inputs.
I mean the end result: it doesn't matter whether the optimizer is a visible GUI that we have to run, or whether we get the same optimized values by processing the data stream directly...
I can't comment - if 1 GB per core stably holds its value, and disk usage doesn't change either, everything would seem to be fine? Although 1 GB per instrument seems like a lot to me. But then the strategy tester is built that way, and it would be wrong if memory accumulated massively with each run. So I believe the main thing is that the data volume stays uniform while the strategy tester runs and does not grow massively. In that case everything seems to be under control. There is no point in picking at why so much memory is used; the main thing is that memory usage does not grow over time.
However, it would be interesting to test with programs optimized for small memory usage and speed, to see whether the optimizer's memory usage is still that high. I'll try it out soon and let you know. 1 GB per instrument still seems crazy to me...
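One way to run that test from inside an EA is to log the program's and the terminal's memory usage to the journal during a tester pass. `MQLInfoInteger(MQL_MEMORY_USED)` and `TerminalInfoInteger(TERMINAL_MEMORY_USED)` are standard MQL5 calls that both report megabytes; the hourly logging interval below is just an illustrative choice:

```mql5
// Log memory usage periodically during a strategy-tester run
// so the journal shows whether usage grows across a pass.
datetime last_log = 0;

void OnTick()
  {
   // Log at most once per hour of test time (illustrative interval)
   if(TimeCurrent() - last_log < 3600)
      return;
   last_log = TimeCurrent();
   PrintFormat("EA memory: %d MB, terminal memory: %d MB",
               MQLInfoInteger(MQL_MEMORY_USED),
               TerminalInfoInteger(TERMINAL_MEMORY_USED));
  }
```

If the first number stays flat while the second sits near 1 GB, the usage belongs to the tester's own history/tick caches rather than to the EA itself.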
It could also be that the tester itself deliberately keeps about 1 GB of data in memory per instrument at a time, because it would be just as bad if too little data were loaded and disk I/O were constantly needed to refresh it.
But that's also a theory... Need to test...
It's a shame that the MQL5 developers themselves never answer users' questions and we have to discover everything ourselves...
It's not per instrument, it's per core - and each core is running 3 separate symbols across 3 years, so it's more like 0.33 GB of RAM per symbol for three years.