Discussing the article: "Data Science and Machine Learning (Part 21): Unlocking Neural Networks, Optimization algorithms demystified"
The problem turned out to be that the script did not find the file with the training data. In any case, though, the program should handle the situation where the data file is not found.
But now there is such a problem:
2024.11.21 17:27:37.038 Optimisation Algorithms testScript (EURUSD,M1) 50 undeleted dynamic objects found:
2024.11.21 17:27:37.038 Optimisation Algorithms testScript (EURUSD,M1) 10 objects of class 'CTensors'
2024.11.21 17:27:37.038 Optimisation Algorithms testScript (EURUSD,M1) 40 objects of class 'CMatrix'
2024.11.21 17:27:37.038 Optimisation Algorithms testScript (EURUSD,M1) 14816 bytes of leaked memory found
This is because only one "fit" function should be called per instance of the class. I called fit multiple times on the same instance, which results in multiple tensors being created in memory. This was done for educational purposes.
/* Calling multiple fit functions on one neural network class instance in one program is a bad idea;
   too many objects will be left undeleted in memory. The better approach is to delete the class
   instance and create it again before each fit call. */
nn.fit(x_train, y_train, new OptimizerMinBGD(nn_learning_rate), nn_epochs, nn_batch_size, show_batch);
nn.fit(x_train, y_train, new OptimizerRMSprop(nn_learning_rate, 0.1), nn_epochs, nn_batch_size, show_batch);
nn.fit(x_train, y_train, new OptimizerAdaGrad(nn_learning_rate), nn_epochs, nn_batch_size, show_batch);
nn.fit(x_train, y_train, new OptimizerAdam(nn_learning_rate), nn_epochs, nn_batch_size, show_batch);
nn.fit(x_train, y_train, new OptimizerAdaDelta(nn_learning_rate), nn_epochs, nn_batch_size, show_batch);
nn.fit(x_train, y_train, new OptimizerNadam(nn_learning_rate), nn_epochs, nn_batch_size, show_batch);
delete nn;
It should be like this:
//---
nn = new CRegressorNets(hidden_layers, AF_RELU_, LOSS_MSE_);
x_train = scaler.fit_transform(x_train);
nn.fit(x_train, y_train, new OptimizerMinBGD(nn_learning_rate), nn_epochs, nn_batch_size, show_batch);
delete nn;
//---
nn = new CRegressorNets(hidden_layers, AF_RELU_, LOSS_MSE_);
x_train = scaler.fit_transform(x_train);
nn.fit(x_train, y_train, new OptimizerAdam(nn_learning_rate), nn_epochs, nn_batch_size, show_batch);
delete nn;

Check out the new article: Data Science and Machine Learning (Part 21): Unlocking Neural Networks, Optimization algorithms demystified.
Dive into the heart of neural networks as we demystify the optimization algorithms used inside the neural network. In this article, discover the key techniques that unlock the full potential of neural networks, propelling your models to new heights of accuracy and efficiency.
It seems like everybody nowadays is interested in Artificial Intelligence. It's everywhere, and the big players in the tech industry, such as Google and Microsoft (the major backer of OpenAI), are pushing for AI adoption across different industries such as entertainment, healthcare, arts, creativity, etc.
I see this trend in the MQL5 community as well, and why not? With the introduction of matrices, vectors, and ONNX to MetaTrader 5, it is now possible to build artificial intelligence trading models of any complexity; you don't even need to be an expert in linear algebra or understand everything that goes on inside the system.
Despite all that, the fundamentals of machine learning are now harder to find than ever, yet they are just as important for solidifying your understanding of AI. They let you know why you do what you do, which makes you flexible and lets you exercise your options. There is a lot we have yet to discuss in machine learning. Today we'll look at what optimization algorithms are, how they fare against one another, and when and which optimization algorithm you should choose for better performance and accuracy in your neural networks.
Author: Omega J Msigwa