Check these links for advanced forecasting:
https://otexts.com/fpp3/toolbox.html
https://skforecast.org/0.13.0/introduction-forecasting/introduction-forecasting
Forum on trading, automated trading systems and testing trading strategies
Better NN EA
Sergey Golubev, 2024.12.11 14:48
Neural Networks Made Easy (Part 95): Reducing Memory Consumption in Transformer Models
The introduction of the Transformer architecture back in 2017 led to the emergence of Large Language Models (LLMs), which demonstrate high results in solving natural language processing problems. The advantages of Self-Attention approaches were soon adopted by researchers in virtually every area of machine learning.
However, due to its autoregressive nature, the Transformer Decoder is limited by the memory bandwidth used to load and store the Key and Value entities at each time step (known as KV caching). Since this cache scales linearly with model size, batch size, and context length, it can even exceed the memory usage of the model weights.
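To make that scaling concrete (an illustrative sketch, not code from the article; all model dimensions below are made-up placeholders), here is a back-of-the-envelope estimate of KV cache size for a standard multi-head attention decoder in Python:

# Hedged sketch: estimate KV cache memory for a vanilla Transformer decoder.
# The cache stores Keys and Values per layer:
# 2 * layers * batch * seq_len * heads * head_dim elements.
def kv_cache_bytes(n_layers, n_heads, head_dim, batch_size, seq_len, bytes_per_elem=2):
    # the factor 2 accounts for both Key and Value tensors; fp16 -> 2 bytes per element
    return 2 * n_layers * batch_size * seq_len * n_heads * head_dim * bytes_per_elem

# Example with made-up dimensions: the result grows linearly in batch size and context length
print(kv_cache_bytes(n_layers=32, n_heads=32, head_dim=128, batch_size=8, seq_len=4096) / 2**30, "GiB")

With these placeholder dimensions the cache alone comes to roughly 16 GiB, illustrating how it can rival or even exceed the memory taken by the model weights.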
Forum on trading, automated trading systems and testing trading strategies
Better NN EA
Sergey Golubev, 2024.12.13 08:30
Neural Network in Practice: Pseudoinverse (I)

In the previous article, "Neural Network in Practice: Straight Line Function", we discussed how algebraic equations can be used to determine part of the information we are looking for. This is needed in order to formulate an equation, which in our particular case is the equation of a straight line, since our small data set can in fact be expressed as a straight line. The material explaining how neural networks work is not easy to present without knowing each reader's level of mathematical knowledge.
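As a small, self-contained illustration of that idea (my own sketch, not code from the article; the sample points are made up), here is a straight-line fit y = a*x + b obtained with the Moore-Penrose pseudoinverse in NumPy:

import numpy as np

# Hypothetical sample points assumed to lie roughly on a straight line
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 8.1])

# Design matrix for y = a*x + b
A = np.column_stack([x, np.ones_like(x)])

# Least-squares solution via the pseudoinverse: coeffs = pinv(A) @ y
a, b = np.linalg.pinv(A) @ y
print(f"slope a = {a:.3f}, intercept b = {b:.3f}")

For this toy data the pseudoinverse reproduces the ordinary least-squares slope and intercept.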
Forum on trading, automated trading systems and testing trading strategies
Better NN EA
Sergey Golubev, 2024.12.21 16:42
Neural Networks Made Easy (Part 96): Multi-Scale Feature Extraction (MSFformer)
Forum on trading, automated trading systems and testing trading strategies
Better NN EA
Sergey Golubev, 2025.01.14 06:41
Neural Networks Made Easy (Part 97): Training Models With MSFformer
Don't you think MQL5 is too "limited" for neural networks? I've made hundreds of EAs but could never build anything that required a lot of complexity. I've been working on a model, but I decided to use Python.
I've been using SAC (Soft Actor-Critic), a Reinforcement Learning algorithm, via stable-baselines3 (https://stable-baselines3.readthedocs.io/en/master/).
I use a Convolutional Neural Network (CNN) with conv2d (better for time series data), but I'm still struggling a little to make the training stable and fast enough.
Once I'm satisfied with the model, I'll run validation on another set of data (not used in training) for hyperparameter optimization, probably with a Bayesian method such as Optuna (https://optuna.org/); a minimal sketch of this setup follows below.
Finally, I'll test the model with the selected hyperparameters on yet another set of data (not used in training or validation). Then I'll put it on a demo account and, at last, run it on a real account (if everything goes as planned).
Is anyone following a similar path and willing to exchange some ideas?
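Here is a minimal sketch (not the poster's actual code) of wrapping stable-baselines3 SAC in an Optuna objective; MyTradingEnv and the search ranges are hypothetical placeholders standing in for your own environment and tuning space:

import optuna
from stable_baselines3 import SAC
from stable_baselines3.common.evaluation import evaluate_policy

# MyTradingEnv is a hypothetical custom gymnasium.Env producing the time-series observations
from my_trading_env import MyTradingEnv  # assumption: your own environment module

def objective(trial):
    # Bayesian search over a couple of SAC hyperparameters (ranges are illustrative)
    learning_rate = trial.suggest_float("learning_rate", 1e-5, 1e-3, log=True)
    gamma = trial.suggest_float("gamma", 0.9, 0.9999)

    train_env = MyTradingEnv(split="train")       # training data only
    valid_env = MyTradingEnv(split="validation")  # held-out data for tuning

    model = SAC("MlpPolicy", train_env, learning_rate=learning_rate, gamma=gamma, verbose=0)
    model.learn(total_timesteps=50_000)

    # Score the candidate on the validation environment, never on the training data
    mean_reward, _ = evaluate_policy(model, valid_env, n_eval_episodes=5)
    return mean_reward

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
print(study.best_params)

The objective only ever sees the training and validation splits; the third (test) set stays untouched until the final check with the selected hyperparameters.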
Forum on trading, automated trading systems and testing trading strategies
Better NN EA
Sergey Golubev, 2025.01.16 06:44
Neural Networks in Trading: Piecewise Linear Representation of Time Series
Forum on trading, automated trading systems and testing trading strategies
Better NN EA
Sergey Golubev, 2025.01.18 03:44
Neural Networks in Trading: Dual-Attention-Based Trend Prediction Model
Forum on trading, automated trading systems and testing trading strategies
Better NN EA
Sergey Golubev, 2025.01.18 03:51
Neural Network in Practice: Pseudoinverse (II)
Hi, you could import the ONNX model into MQL5 after you are done.
Why are you using three data sets?
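For the ONNX route, here is a rough sketch along the lines of the stable-baselines3 export docs (the file names, the wrapper and the opset version are placeholders and may need adapting to your policy):

import torch
from stable_baselines3 import SAC

model = SAC.load("sac_trading_model")  # placeholder path to your trained model

class OnnxablePolicy(torch.nn.Module):
    # Thin wrapper so torch.onnx.export sees a plain forward(obs) -> action module
    def __init__(self, policy):
        super().__init__()
        self.policy = policy

    def forward(self, observation):
        return self.policy(observation, deterministic=True)

dummy_obs = torch.randn(1, *model.observation_space.shape)
torch.onnx.export(OnnxablePolicy(model.policy), dummy_obs, "sac_policy.onnx", opset_version=17)

The resulting sac_policy.onnx can then be loaded and run from an EA with MQL5's built-in ONNX support (OnnxCreate / OnnxRun).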