Discussing the article: "MQL5 Wizard Techniques you should know (Part 28): GANs Revisited with a Primer on Learning Rates"

Check out the new article: MQL5 Wizard Techniques you should know (Part 28): GANs Revisited with a Primer on Learning Rates.
The learning rate is the step size taken toward a training target in many machine learning algorithms' training processes. We examine the impact its many schedules and formats can have on the performance of a Generative Adversarial Network, a type of neural network that we examined in an earlier article.
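For readers new to the concept, the learning rate scales every weight update made during training. A minimal sketch in MQL5 of a single gradient-descent step is shown below; the function and array names are illustrative only and are not taken from the article's attached code.

//+------------------------------------------------------------------+
//| Illustrative only: one gradient-descent weight update.           |
//| 'weights' and 'gradients' are hypothetical arrays; 'eta' is the  |
//| learning rate whose schedules the article compares.              |
//+------------------------------------------------------------------+
void UpdateWeights(double &weights[], const double &gradients[], double eta)
  {
   for(int i = 0; i < ArraySize(weights); i++)
      weights[i] -= eta * gradients[i];   // step of size eta toward lower loss
  }

A larger eta takes bigger steps (faster but riskier convergence); a smaller eta takes smaller, more cautious steps, which is exactly the trade-off the schedules below try to manage.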
The format of this article will differ somewhat from prior articles. When presenting each learning rate format, its strategy testing reports will accompany it. This contrasts slightly with what we have had before, where the reports typically all came at the end of the article, just before the conclusion. So, this is an exploratory format that keeps an open mind about whether learning rates meaningfully affect the performance of machine learning algorithms, or more specifically GANs. Because we are looking at multiple types and formats of learning rates, it is important to have uniform testing metrics, which is why we will use a single symbol, time frame, and testing period across all the learning rate types.
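As a rough orientation to the kinds of learning rate formats being compared, here is a hedged MQL5 sketch of three standard schedules (fixed, exponential decay, and cosine annealing). The formulas are textbook-standard, but the parameter names (eta0, decay, total_epochs) are placeholders for illustration, not the article's actual inputs.

// Sketch of common learning-rate schedules; parameters are illustrative.
double FixedRate(double eta0)
  {
   return(eta0);                                        // constant step size
  }

double ExponentialDecay(double eta0, double decay, int epoch)
  {
   return(eta0 * MathExp(-decay * epoch));              // shrinks each epoch
  }

double CosineAnnealing(double eta0, int epoch, int total_epochs)
  {
   // glides from eta0 down to zero along a half cosine wave
   return(0.5 * eta0 * (1.0 + MathCos(M_PI * epoch / total_epochs)));
  }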
Based on this, our symbol throughout will be EURJPY, the time frame will be daily, and the test period will be the year 2023. We are testing a GAN, and its default architecture is certainly a factor. There is always the argument that a more elaborate design, in terms of the number and size of the layers, is paramount; while those are all important considerations, our focus here is the learning rate. To that end, our GAN will be relatively simple, with just three layers, one of which is hidden. The overall sizing will be 5-8-1 from input to output. The settings for these are indicated in the attached code and can easily be modified by readers who wish to use an alternative configuration.
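For concreteness, the 5-8-1 topology could be expressed along the following lines; the array name and how it would be consumed are assumptions for illustration, since the actual settings structure is defined in the attached source.

// Hypothetical sketch of the 5-8-1 topology described above;
// the article's attached code defines the real settings.
int layer_sizes[] = {5, 8, 1};   // input, single hidden layer, output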
Author: Stephen Njuki